| datasetId | card |
|---|---|
kozistr/mqa-ko | ---
language:
- ko
license: cc0-1.0
task_categories:
- question-answering
tags:
- mqa
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 541067862
num_examples: 1382378
download_size: 162865210
dataset_size: 541067862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
* https://huggingface.co/datasets/clips/mqa |
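The `dataset_info` block above declares the train split's raw size, compressed download size, and example count. As a quick sanity check of those figures (values copied verbatim from the card; the commented-out `load_dataset` call is the standard way to pull the split itself, but needs the `datasets` library and network access):

```python
# Figures copied from the card's dataset_info block above.
dataset_size = 541_067_862   # num_bytes of the train split (uncompressed)
download_size = 162_865_210  # download_size (compressed Parquet)
num_examples = 1_382_378     # num_examples in the train split

# Parquet compresses this text roughly 3.3x; each QA pair averages ~391 bytes.
print(f"compression: ~{dataset_size / download_size:.1f}x")
print(f"avg example size: ~{dataset_size / num_examples:.0f} bytes")

# To actually load the split:
# from datasets import load_dataset
# ds = load_dataset("kozistr/mqa-ko", split="train")
```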
open-llm-leaderboard/details_Charlie911__MultiLoRA-llama2-mmlu | ---
pretty_name: Evaluation run of Charlie911/MultiLoRA-llama2-mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/MultiLoRA-llama2-mmlu](https://huggingface.co/Charlie911/MultiLoRA-llama2-mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__MultiLoRA-llama2-mmlu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T20:19:51.603035](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-llama2-mmlu/blob/main/results_2024-02-09T20-19-51.603035.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42939450714393407,\n\
\ \"acc_stderr\": 0.03450029235435365,\n \"acc_norm\": 0.4336173195651683,\n\
\ \"acc_norm_stderr\": 0.03529727761229674,\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.40926286124406613,\n\
\ \"mc2_stderr\": 0.01393003126171617\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.0145853058400071,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.01459700192707614\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5821549492133041,\n\
\ \"acc_stderr\": 0.00492196413387402,\n \"acc_norm\": 0.7759410476000796,\n\
\ \"acc_norm_stderr\": 0.004161089244867776\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075657,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075657\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.03681229633394319,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.03681229633394319\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.43548387096774194,\n\
\ \"acc_stderr\": 0.02820622559150274,\n \"acc_norm\": 0.43548387096774194,\n\
\ \"acc_norm_stderr\": 0.02820622559150274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.46060606060606063,\n \"acc_stderr\": 0.03892207016552013,\n\
\ \"acc_norm\": 0.46060606060606063,\n \"acc_norm_stderr\": 0.03892207016552013\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6062176165803109,\n \"acc_stderr\": 0.035260770955482405,\n\
\ \"acc_norm\": 0.6062176165803109,\n \"acc_norm_stderr\": 0.035260770955482405\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135377,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135377\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5504587155963303,\n \"acc_stderr\": 0.021327881417823363,\n \"\
acc_norm\": 0.5504587155963303,\n \"acc_norm_stderr\": 0.021327881417823363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\"\
: 0.3148148148148148,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5245098039215687,\n\
\ \"acc_stderr\": 0.03505093194348798,\n \"acc_norm\": 0.5245098039215687,\n\
\ \"acc_norm_stderr\": 0.03505093194348798\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.569620253164557,\n \"acc_stderr\": 0.032230171959375976,\n\
\ \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.032230171959375976\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775088,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5299145299145299,\n\
\ \"acc_stderr\": 0.032697411068124425,\n \"acc_norm\": 0.5299145299145299,\n\
\ \"acc_norm_stderr\": 0.032697411068124425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.598978288633461,\n\
\ \"acc_stderr\": 0.017526133150124572,\n \"acc_norm\": 0.598978288633461,\n\
\ \"acc_norm_stderr\": 0.017526133150124572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.026788811931562764,\n\
\ \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.026788811931562764\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.028431095444176647,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.028431095444176647\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.02775653525734767,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.02775653525734767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503814,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503814\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n\
\ \"acc_stderr\": 0.012078563777145564,\n \"acc_norm\": 0.3376792698826597,\n\
\ \"acc_norm_stderr\": 0.012078563777145564\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468314,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39869281045751637,\n \"acc_stderr\": 0.019808281317449848,\n \
\ \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.019808281317449848\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004129,\n\
\ \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n\
\ \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.48756218905472637,\n\
\ \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.40926286124406613,\n\
\ \"mc2_stderr\": 0.01393003126171617\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.01235894443163756\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \
\ \"acc_stderr\": 0.008870331256489986\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/MultiLoRA-llama2-mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|arc:challenge|25_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|gsm8k|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hellaswag|10_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T20-19-51.603035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T20-19-51.603035.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- '**/details_harness|winogrande|5_2024-02-09T20-19-51.603035.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T20-19-51.603035.parquet'
- config_name: results
data_files:
- split: 2024_02_09T20_19_51.603035
path:
- results_2024-02-09T20-19-51.603035.parquet
- split: latest
path:
- results_2024-02-09T20-19-51.603035.parquet
---
# Dataset Card for Evaluation run of Charlie911/MultiLoRA-llama2-mmlu
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Charlie911/MultiLoRA-llama2-mmlu](https://huggingface.co/Charlie911/MultiLoRA-llama2-mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__MultiLoRA-llama2-mmlu",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-09T20:19:51.603035](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-llama2-mmlu/blob/main/results_2024-02-09T20-19-51.603035.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.42939450714393407,
"acc_stderr": 0.03450029235435365,
"acc_norm": 0.4336173195651683,
"acc_norm_stderr": 0.03529727761229674,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.40926286124406613,
"mc2_stderr": 0.01393003126171617
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.0145853058400071,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.01459700192707614
},
"harness|hellaswag|10": {
"acc": 0.5821549492133041,
"acc_stderr": 0.00492196413387402,
"acc_norm": 0.7759410476000796,
"acc_norm_stderr": 0.004161089244867776
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075657,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075657
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.03681229633394319,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.03681229633394319
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399421,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399421
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43548387096774194,
"acc_stderr": 0.02820622559150274,
"acc_norm": 0.43548387096774194,
"acc_norm_stderr": 0.02820622559150274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.46060606060606063,
"acc_stderr": 0.03892207016552013,
"acc_norm": 0.46060606060606063,
"acc_norm_stderr": 0.03892207016552013
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6062176165803109,
"acc_stderr": 0.035260770955482405,
"acc_norm": 0.6062176165803109,
"acc_norm_stderr": 0.035260770955482405
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135377,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135377
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5504587155963303,
"acc_stderr": 0.021327881417823363,
"acc_norm": 0.5504587155963303,
"acc_norm_stderr": 0.021327881417823363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.569620253164557,
"acc_stderr": 0.032230171959375976,
"acc_norm": 0.569620253164557,
"acc_norm_stderr": 0.032230171959375976
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775088,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5299145299145299,
"acc_stderr": 0.032697411068124425,
"acc_norm": 0.5299145299145299,
"acc_norm_stderr": 0.032697411068124425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.598978288633461,
"acc_stderr": 0.017526133150124572,
"acc_norm": 0.598978288633461,
"acc_norm_stderr": 0.017526133150124572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.026788811931562764,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.026788811931562764
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.028431095444176647,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.028431095444176647
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.02775653525734767,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.02775653525734767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503814,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503814
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145564,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145564
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468314,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.019808281317449848,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.019808281317449848
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48756218905472637,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.48756218905472637,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.40926286124406613,
"mc2_stderr": 0.01393003126171617
},
"harness|winogrande|5": {
"acc": 0.7379636937647988,
"acc_stderr": 0.01235894443163756
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489986
}
}
```
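The per-task results above can be aggregated programmatically. The sketch below is a minimal illustration, not part of the evaluation harness: it inlines a small excerpt of the results dictionary (values taken from the run above), filters the `hendrycksTest` (MMLU) entries, and averages their `acc` values.

```python
# Average the 5-shot MMLU ("hendrycksTest") accuracies from a results
# dictionary shaped like the JSON excerpt above. Only a few entries of
# the full results are inlined here for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6140350877192983},
    "harness|winogrande|5": {"acc": 0.7379636937647988},  # not an MMLU task
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {mmlu_avg:.4f}")
```

The same filter applied to the full dictionary (all 57 `hendrycksTest` entries) reproduces the model's aggregate MMLU score.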
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gagan3012/arabic-xnli-pairwise | ---
dataset_info:
features:
- name: labels
sequence: int64
- name: sent1
sequence: string
- name: sent2
sequence: string
splits:
- name: train
num_bytes: 70811123
num_examples: 1
- name: test
num_bytes: 850605
num_examples: 1
- name: validation
num_bytes: 415074
num_examples: 1
download_size: 37859272
dataset_size: 72076802
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
BangumiBase/lycorisrecoil | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Lycoris Recoil
This is the image base of the bangumi Lycoris Recoil. We detected 31 characters and 2,149 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 22 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 67 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 17 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 117 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 120 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 21 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 79 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 36 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 16 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 11 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 11 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 10 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 10 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 118 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 10 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 54 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 50 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 23 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 10 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 9 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 407 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 13 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 102 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 9 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 27 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 510 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 33 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 27 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 165 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Ti-Ma/TiMaGPT2-2019 | ---
license: other
license_name: paracrawl-license
license_link: LICENSE
---
|
DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1 | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 37678642
num_examples: 21982
download_size: 16914098
dataset_size: 37678642
---
# Dr. Nicefellow's Worry Free General Chat Dataset v1
## Overview
This dataset contains high-quality general-chat question-and-answer samples. It is designed following the LIMA (Less Is More for Alignment) principle from Meta AI, which emphasizes the importance of quality over quantity in training data. Despite its modest size, the dataset's quality makes it effective for training and fine-tuning conversational AI models.
In this version, each chat consists of one user query and one assistant answer. In the next version, samples will become multi-round conversations.
## Dataset Format
The dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.
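As a rough illustration of the Vicuna 1.1 convention, a one-round sample can be split back into its user and assistant turns. This is a minimal sketch assuming the conventional `USER:` / `ASSISTANT:` markers; the exact field layout of individual samples may differ:

```python
def parse_vicuna_chat(text: str) -> tuple[str, str]:
    """Split a one-round Vicuna-1.1-style sample into (user, assistant).

    Assumes the conventional "USER: ... ASSISTANT: ..." markers; anything
    before "USER:" (e.g. a system prompt) is discarded.
    """
    user_part, _, assistant_part = text.partition("ASSISTANT:")
    user = user_part.split("USER:", 1)[-1].strip()
    return user, assistant_part.strip()

sample = "USER: What is 2 + 2? ASSISTANT: 2 + 2 equals 4."
print(parse_vicuna_chat(sample))  # ('What is 2 + 2?', '2 + 2 equals 4.')
```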
## Volume
The dataset comprises a few thousand chat samples. Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.
## Licensing
Our dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.
## Use Case
This dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:
datasets:
  - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1
    type: completion
## Feeling Generous? 😊
Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: [https://ko-fi.com/drnicefellow](https://ko-fi.com/drnicefellow). Please add a note about which one you want me to drink!
xiaxiaoqian/model | ---
license: mit
---
|
jsqihui/detective-dataset-en | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 746174
num_examples: 74
download_size: 409863
dataset_size: 746174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvations/c4p0-v2-en-de | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
- name: dataset
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
splits:
- name: train
num_bytes: 41900834
num_examples: 34234
download_size: 19737322
dataset_size: 41900834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4 | ---
pretty_name: Evaluation run of Mikivis/gpt2-large-lora-stf4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T23:48:52.785657](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-10-27T23-48-52.785657.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003460570469798658,\n\
\ \"em_stderr\": 0.0006013962884271089,\n \"f1\": 0.07443372483221503,\n\
\ \"f1_stderr\": 0.0016782330994195233,\n \"acc\": 0.26795580110497236,\n\
\ \"acc_stderr\": 0.007008096716979156\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271089,\n\
\ \"f1\": 0.07443372483221503,\n \"f1_stderr\": 0.0016782330994195233\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5359116022099447,\n\
\ \"acc_stderr\": 0.014016193433958312\n }\n}\n```"
repo_url: https://huggingface.co/Mikivis/gpt2-large-lora-stf4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T23_48_52.785657
path:
- '**/details_harness|drop|3_2023-10-27T23-48-52.785657.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T23-48-52.785657.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T23_48_52.785657
path:
- '**/details_harness|gsm8k|5_2023-10-27T23-48-52.785657.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T23-48-52.785657.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T23_48_52.785657
path:
- '**/details_harness|winogrande|5_2023-10-27T23-48-52.785657.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T23-48-52.785657.parquet'
- config_name: results
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- results_2023-09-12T03-05-07.244584.parquet
- split: 2023_10_27T23_48_52.785657
path:
- results_2023-10-27T23-48-52.785657.parquet
- split: latest
path:
- results_2023-10-27T23-48-52.785657.parquet
---
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-stf4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-27T23:48:52.785657](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-10-27T23-48-52.785657.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003460570469798658,
"em_stderr": 0.0006013962884271089,
"f1": 0.07443372483221503,
"f1_stderr": 0.0016782330994195233,
"acc": 0.26795580110497236,
"acc_stderr": 0.007008096716979156
},
"harness|drop|3": {
"em": 0.003460570469798658,
"em_stderr": 0.0006013962884271089,
"f1": 0.07443372483221503,
"f1_stderr": 0.0016782330994195233
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5359116022099447,
"acc_stderr": 0.014016193433958312
}
}
```
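As a sanity check, the aggregated `all` block can be recomputed from the per-task entries shown above; the sketch below assumes the harness takes an unweighted mean over the tasks that report each metric (that assumption is mine, not documented here):

```python
# Recompute the aggregated "all" metrics from the per-task results above.
# Assumption: "all" is an unweighted mean over tasks reporting the metric
# (gsm8k and winogrande for "acc"; drop alone for "em"/"f1").
results = {
    "harness|drop|3": {"em": 0.003460570469798658, "f1": 0.07443372483221503},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5359116022099447},
}

def aggregate(metric: str) -> float:
    """Mean of `metric` across all tasks that report it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

print(aggregate("acc"))  # matches the "all" acc of roughly 0.2680 above
```

Under that assumption, the `all` accuracy is simply the mean of the gsm8k and winogrande accuracies.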
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MatsuoDochiai/Mae | ---
license: openrail
---
|
deepharborAI/Hindi-Niband | ---
license: mit
task_categories:
- text-generation
- table-question-answering
- summarization
language:
- hi
pretty_name: 'Niband '
size_categories:
- 100M<n<1B
---
### Dataset Name: Hindi-Niband (Massive Hindi Language Text Dataset)
#### Dataset Overview
This dataset is a comprehensive collection of text data consisting of more than 10 billion tokens. It encompasses a wide range of sources, including Wikipedia articles, news articles, email transcripts, and generated prompt text. Specific Hindi language data columns have been extracted from the CulturaX dataset, which is a large, cleaned, and multilingual dataset for large language models. We acknowledge and cite the CulturaX dataset using the provided citation.
#### Data Sources
1. **Wikipedia Articles:** A large corpus of text extracted from Wikipedia articles covering various topics and domains.
2. **News Articles:** Textual data sourced from news articles from diverse sources and regions.
3. **Email Transcripts:** Transcripts of email communications, providing insights into natural language usage in electronic correspondence.
4. **Prompt Text Generation:** Text generated from prompts or prompts used to generate text, contributing to the dataset's diversity and complexity.
5. **Hindi Data from CulturaX Dataset:** Specific Hindi language data columns have been extracted from the CulturaX dataset, which is a large, cleaned, and multilingual dataset for large language models.
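The extraction of Hindi-only rows described in item 5 amounts to a language filter over a multilingual corpus; a minimal illustrative sketch (the `language`/`text` field names here are hypothetical placeholders, not the actual CulturaX schema):

```python
# Illustrative sketch: keep only Hindi rows from a multilingual corpus.
# The "language" and "text" field names are hypothetical placeholders.
corpus = [
    {"language": "hi", "text": "यह एक हिंदी वाक्य है।"},
    {"language": "en", "text": "This is an English sentence."},
    {"language": "hi", "text": "दूसरा हिंदी उदाहरण।"},
]

# Select the text of every row tagged as Hindi.
hindi_rows = [row["text"] for row in corpus if row["language"] == "hi"]
print(len(hindi_rows))  # 2
```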
#### Potential Uses
- Training and evaluating natural language generation models in the Hindi language domain.
- Exploring the capabilities of models in narrative generation tasks.
- Conducting research on narrative understanding and generation in Hindi.
- Analyzing sentiment and opinion mining in Hindi text data.
- Building chatbots or virtual assistants capable of interacting in Hindi.
- Creating educational resources for teaching Hindi language and literature.
- Developing machine translation systems for translating between Hindi and other languages.
- Studying cross-lingual transfer learning techniques for improving natural language processing tasks in Hindi.
#### Importance for Indian Native Languages
This dataset can be crucial for training LLMs (Large Language Models) and for exploring the capabilities of natural language generation models in Hindi. It serves as a foundation for training and evaluating models capable of producing coherent and contextually relevant narratives or explanations. Additionally, this dataset aligns with our commitment to promoting Indian native languages on a global scale. We recognize the limited availability of such datasets as a major challenge for innovation within the local Indian community. As part of our contribution to the Indian open-source community, we are planning to release a very large database covering various Indian native languages. This initiative aims to empower researchers, practitioners, and developers to explore and innovate in Indian language processing and generation tasks.
#### Citation
If you use this dataset in your research or applications, please consider citing the CulturaX dataset using the provided citation.
We acknowledge and cite the CulturaX dataset using the following citation:
```
@misc{nguyen2023culturax,
title={CulturaX: A Cleaned, Enormous, and Multilingual Dataset for Large Language Models in 167 Languages},
author={Thuat Nguyen and Chien Van Nguyen and Viet Dac Lai and Hieu Man and Nghia Trung Ngo and Franck Dernoncourt and Ryan A. Rossi and Thien Huu Nguyen},
year={2023},
eprint={2309.09400},
archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
Additionally, the dataset includes news article data, and we acknowledge and cite the source of this data using the following citations:
```
@inproceedings{see-etal-2017-get,
title = "Get To The Point: Summarization with Pointer-Generator Networks",
author = "See, Abigail and
Liu, Peter J. and
Manning, Christopher D.",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P17-1099",
doi = "10.18653/v1/P17-1099",
pages = "1073--1083",
abstract = "Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator. Second, we use coverage to keep track of what has been summarized, which discourages repetition. We apply our model to the CNN / Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.",}
@inproceedings{DBLP:conf/nips/HermannKGEKSB15,
author={Karl Moritz Hermann and Tomás Kociský and Edward Grefenstette and Lasse Espeholt and Will Kay and Mustafa Suleyman and Phil Blunsom},
title={Teaching Machines to Read and Comprehend},
year={2015},
cdate={1420070400000},
pages={1693-1701},
url={http://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend},
booktitle={NIPS},
crossref={conf/nips/2015}
}
```
#### License
Please refer to the licensing terms specified by the dataset creators.
#### Disclaimer
The views expressed in the dataset do not necessarily reflect the views of the dataset creators or contributors. Users are advised to use the data responsibly and in accordance with ethical guidelines.
This dataset card provides an overview of the massive multilingual text dataset, highlighting its sources, potential uses, citation, and disclaimer. |
ravithejads/ms_marco_hi | ---
dataset_info:
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
- name: query_hi
dtype: string
- name: answers_hi
dtype: string
- name: passage_text_hi
sequence: string
splits:
- name: test
num_bytes: 129079041
num_examples: 9650
download_size: 49278186
dataset_size: 129079041
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
FranzderPapst/squad_x_boolq | ---
license: mit
language:
- en
task_categories:
- text-classification
pretty_name: warrgalbhalble
size_categories:
- 1K<n<10K
---
# ABOUT
Wanted to train a model to classify questions as either open or boolean, so I merged SQuAD with BoolQ. The dataset contains 5,000 questions from each source, labeled "true" for the boolean questions and "false" for the open ones. I didn't add questions that don't fall into these categories. May be a flaw, we'll see :).
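The merge-and-label construction can be sketched like this (a hypothetical reconstruction for illustration, not the script actually used):

```python
import random

# Hypothetical reconstruction of the merge: BoolQ questions get the label
# "true" (boolean) and SQuAD questions get "false" (open), then shuffle.
boolq_questions = ["are there fiber optic cables under the ocean"]
squad_questions = ["What is the capital of France?"]

merged = [{"question": q, "type": "true"} for q in boolq_questions]
merged += [{"question": q, "type": "false"} for q in squad_questions]
random.shuffle(merged)

print(len(merged))  # 2
```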
For some reason the dataset viewer isn't working, sorry about that, but here's a snippet of the JSON structure:
{
"question": "are there fiber optic cables under the ocean",
"type": "true"
},
{
"question": "are dollar general and dollar tree owned by the same company",
"type": "true"
}, |
Mr-aio/All-Isa-AF |
---
language:
- en
size_categories:
- 1K<n<10K
--- |
NeuralNovel/Unsloth-DPO | ---
language:
- en
license: apache-2.0
---
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
<style>
body {
font-family: 'Quicksand', sans-serif;
background-color: #1A202C;
color: #D8DEE9;
margin: 0;
padding: 0; /* Remove default padding */
font-size: 26px;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
}
p {
padding-left: 10px
}
.container {
width: 100%;
margin: auto;
background-color: rgb(255 255 255 / 1%);
padding: 20px 30px 40px; /* Add padding below the image only */
padding-right: 32px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.05);
background-color: rgb(0 0 0 / 75%) !important;
}
.header h1 {
font-size: 28px;
color: #fff; /* White text color */
margin: 0;
text-shadow:
-1px -1px 0 #000,
1px -1px 0 #000,
-1px 1px 0 #000,
1px 1px 0 #000; /* Black outline */
}
.header {
display: flex;
align-items: center;
justify-content: space-between;
gap: 20px;
}
img {
border-radius: 10px 10px 0 0!important;
padding-left: 0px !important;
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.info {
background-color: rgba(255, 255, 255, 0.05);
color: #AEBAC7;
border-radius: 12px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
font-size: 14px;
line-height: 1.6;
margin-left: 5px;
overflow-x: auto;
margin-top: 20px; /* Adjusted margin */
border: 1px solid rgba(255, 255, 255, 0.05);
transition: background-color 0.6s ease; /* Smooth transition over 0.5 seconds */
}
.info:hover {
}
.info img {
width: 100%;
border-radius: 10px 10px 0 0;
margin-top: -20px; /* Negative margin to overlap container margin */
}
a {
color: #88C0D0;
text-decoration: none;
transition: color 0.3s ease;
position: relative;
}
a:hover {
color: #A3BE8C;
text-decoration: none;
}
a::before {
content: '';
position: absolute;
width: 100%;
height: 2px;
bottom: 0;
left: 0;
background-color: #A3BE8C;
visibility: hidden;
transform: scaleX(0);
transition: all 0.3s ease-in-out;
}
a:hover::before {
visibility: visible;
transform: scaleX(1);
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
transition: background-color 0.3s ease;
}
.button:hover {
background-color: #81A1C1;
}
.hf-sanitized.hf-sanitized-oJB5trHYB93-j8lDfGQn3 .container {
}
</style>
</head>
<body>
<div class="container">
<div class="header">
<h1>Unsloth-DPO</h1>
</div>
<div class="info">
<img src="https://i.ibb.co/hY42ZY7/OIG4-8.jpg" style="border-radius: 10px;">
<p><strong>Creator:</strong> <a href="https://huggingface.co/NeuralNovel" target="_blank">NeuralNovel</a></p>
<p><strong>Community Organization:</strong> <a href="https://huggingface.co/ConvexAI" target="_blank">ConvexAI</a></p>
<p><strong>Discord:</strong> <a href="https://discord.gg/rJXGjmxqzS" target="_blank">Join us on Discord</a></p>
<p><strong>Special Thanks: <a href ="https://unsloth.ai/" target="_blank"> Unsloth.ai</strong></a></p>
<div>
<div>
          <p><strong>About Unsloth-DPO:</strong> The Unsloth-DPO dataset is inspired by orca_dpo_pairs and features question and answer pairs with a direct focus on Unsloth.ai.</p>
<p><strong>Source Data:</strong></p>
<ul>
<li>orca_dpo_pairs (Inspiration)</li>
<li>Make LLM Fine-tuning 2x faster with Unsloth and 🤗 TRL</li>
<li>unsloth.ai/blog/mistral-benchmark</li>
</ul>
<p><strong>Phrases Removed:</strong></p>
<p>To enhance the dataset's coherence and relevance across varied contexts, certain phrases have been selectively omitted.</p>
<ul>
<li>Couldn't help but</li>
<li>Can't resist</li>
<li>I'm sorry, but</li>
<li>As an AI</li>
<li>However, it is important to</li>
<li>Cannot provide</li>
<li>And others</li>
</ul>
</div>
</div>
</body> |
gallantVN/en_vi_DPO | ---
license: apache-2.0
task_categories:
- translation
- reinforcement-learning
size_categories:
- 10K<n<100K
--- |
autoevaluate/autoeval-eval-lener_br-lener_br-2a71c5-1777061681 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: pierreguillou/ner-bert-large-cased-pt-lenerbr
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: pierreguillou/ner-bert-large-cased-pt-lenerbr
* Dataset: lener_br
* Config: lener_br
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
ktmeng/mec | ---
license: mit
---
|
ola13/small-the_pile | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: perplexity_score
dtype: float64
- name: pile_set_name
dtype: string
splits:
- name: train
num_bytes: 606056668
num_examples: 100000
download_size: 328667964
dataset_size: 606056668
---
# Dataset Card for "small-the_pile"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Food101_test_google_flan_t5_xl_mode_A_ns_25250 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 10610673
num_examples: 25250
download_size: 1146498
dataset_size: 10610673
---
# Dataset Card for "Food101_test_google_flan_t5_xl_mode_A_ns_25250"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jkv53/13F_Reports_with_labels | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 12642773
num_examples: 1113
download_size: 3334911
dataset_size: 12642773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "13F_Reports_with_labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kenanjeff/ComVG | ---
license: creativeml-openrail-m
task_categories:
- zero-shot-classification
tags:
- code
size_categories:
- 1K<n<10K
---
Compositional Visual Genome (ComVG) <br/>
The ComVG benchmark aims to test vision-language models' ability in text-to-image retrieval. <br/>
We selected 542 high-quality images from Visual Genome and created 5,400 datapoints in ComVG. <br/>
Each datapoint contains a positive and a negative image. The negative image is a mutated
variant with a singular discrepancy in subject, object, or predicate.<br/>
For more details on the creation process, please refer to: https://arxiv.org/abs/2211.13854
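The retrieval setup can be sketched as scoring both images against the text and checking that the positive ranks higher; this is an illustrative sketch with cosine similarity over toy embedding vectors, not the actual ComVG evaluation code:

```python
import math

# Illustrative text-to-image retrieval check on a ComVG-style datapoint:
# the model should score the positive image above its mutated negative.
# The embedding vectors below are toy values, not real model outputs.

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

text_emb = [0.9, 0.1, 0.2]
positive_img = [0.8, 0.2, 0.1]   # image matching the sentence
negative_img = [0.1, 0.9, 0.3]   # mutated subject, object, or predicate

correct = cosine(text_emb, positive_img) > cosine(text_emb, negative_img)
print(correct)  # True
```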
|
open-llm-leaderboard/details_922CA__Silicon-Monika-7b | ---
pretty_name: Evaluation run of 922CA/Silicon-Monika-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [922CA/Silicon-Monika-7b](https://huggingface.co/922CA/Silicon-Monika-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_922CA__Silicon-Monika-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T12:04:26.776279](https://huggingface.co/datasets/open-llm-leaderboard/details_922CA__Silicon-Monika-7b/blob/main/results_2024-02-29T12-04-26.776279.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6283372065747487,\n\
\ \"acc_stderr\": 0.03245805597450051,\n \"acc_norm\": 0.6301488013494124,\n\
\ \"acc_norm_stderr\": 0.03311159741284949,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5214295545813499,\n\
\ \"mc2_stderr\": 0.015004393759780037\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790152,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042194\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6231826329416451,\n\
\ \"acc_stderr\": 0.004835981632401601,\n \"acc_norm\": 0.8264289982075284,\n\
\ \"acc_norm_stderr\": 0.003779661224651475\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249612,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868062,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660245,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660245\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225419,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225419\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505514,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5214295545813499,\n\
\ \"mc2_stderr\": 0.015004393759780037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6050037907505686,\n \
\ \"acc_stderr\": 0.013465354969973198\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/MultiLoRA-llama2-mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-04-26.776279.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-04-26.776279.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- '**/details_harness|winogrande|5_2024-02-29T12-04-26.776279.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T12-04-26.776279.parquet'
- config_name: results
data_files:
- split: 2024_02_29T12_04_26.776279
path:
- results_2024-02-29T12-04-26.776279.parquet
- split: latest
path:
- results_2024-02-29T12-04-26.776279.parquet
---
# Dataset Card for Evaluation run of 922CA/Silicon-Monika-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [922CA/Silicon-Monika-7b](https://huggingface.co/922CA/Silicon-Monika-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_922CA__Silicon-Monika-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T12:04:26.776279](https://huggingface.co/datasets/open-llm-leaderboard/details_922CA__Silicon-Monika-7b/blob/main/results_2024-02-29T12-04-26.776279.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6283372065747487,
"acc_stderr": 0.03245805597450051,
"acc_norm": 0.6301488013494124,
"acc_norm_stderr": 0.03311159741284949,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5214295545813499,
"mc2_stderr": 0.015004393759780037
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790152,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042194
},
"harness|hellaswag|10": {
"acc": 0.6231826329416451,
"acc_stderr": 0.004835981632401601,
"acc_norm": 0.8264289982075284,
"acc_norm_stderr": 0.003779661224651475
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209818,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209818
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249612,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868062,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660245,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660245
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225419,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225419
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5214295545813499,
"mc2_stderr": 0.015004393759780037
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.6050037907505686,
"acc_stderr": 0.013465354969973198
}
}
```
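The per-task entries in this dict share a common shape; a minimal sketch (field names taken from the results above, values hand-copied for illustration) for averaging `acc_norm` across the tasks that report it:

```python
# Hypothetical excerpt of a results dict in the shape shown above; only the
# structure of the entries matters here, not the specific values.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6313993174061433},
    "harness|hellaswag|10": {"acc_norm": 0.8264289982075284},
    "harness|winogrande|5": {"acc": 0.7821625887924231},
}

# Average acc_norm over the tasks that report it (winogrande only has acc).
accs = [v["acc_norm"] for v in results.values() if "acc_norm" in v]
avg_acc_norm = sum(accs) / len(accs)
```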
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chats-bug/multiple-subject-gen | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: subject_lines
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 78493229
num_examples: 59489
- name: test
num_bytes: 4030472
num_examples: 3132
download_size: 10833380
dataset_size: 82523701
---
# Dataset Card for "multiple-subject-gen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fewshot-goes-multilingual/cs_facebook-comments | ---
annotations_creators:
- found
language:
- cs
language_creators:
- found
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
pretty_name: Czech Facebook comments
size_categories:
- 10K<n<100K
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for Czech Facebook comments
## Dataset Description
The dataset contains user comments from Facebook. Each comment consists of its text and a sentiment label (positive/negative/neutral).
The dataset has in total (train+validation+test) 6,600 comments. The label distribution is balanced.
## Dataset Features
Each sample contains:
- `comment_id`: unique string identifier of the comment.
- `sentiment_str`: string representation of the sentiment - "pozitivní" / "neutrální" / "negativní"
- `sentiment_int`: integer representation of the sentiment (1=positive, 0=neutral, -1=negative)
- `comment`: the string of the comment
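The two sentiment fields are redundant encodings of the same label; a minimal sketch of the mapping (the dict and function names are ours, the label strings and integers come from the feature list above):

```python
# Mapping between sentiment_str and sentiment_int as documented above.
SENTIMENT_STR_TO_INT = {
    "pozitivní": 1,   # positive
    "neutrální": 0,   # neutral
    "negativní": -1,  # negative
}

def sentiment_to_int(sentiment_str: str) -> int:
    """Convert the string sentiment label to its integer counterpart."""
    return SENTIMENT_STR_TO_INT[sentiment_str]
```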
## Dataset Source
The data is a processed adaptation of [Facebook CZ Corpus](https://liks.fav.zcu.cz/sentiment/).
This adaptation is label-balanced.
|
GEM/wiki_cat_sum | ---
annotations_creators:
- automatically-created
language_creators:
- unknown
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- summarization
task_ids: []
pretty_name: wiki_cat_sum
---
# Dataset Card for GEM/wiki_cat_sum
## Dataset Description
- **Homepage:** https://github.com/lauhaide/WikiCatSum
- **Repository:** https://datashare.ed.ac.uk/handle/10283/3368
- **Paper:** https://arxiv.org/abs/1906.04687
- **Leaderboard:** N/A
- **Point of Contact:** Laura Perez-Beltrachini
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/wiki_cat_sum).
### Dataset Summary
WikiCatSum is an English summarization dataset in three domains: animals, companies, and film. It provides multiple paragraphs of text paired with a summary of the paragraphs.
You can load the dataset via:
```python
import datasets
data = datasets.load_dataset('GEM/wiki_cat_sum')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/wiki_cat_sum).
#### website
[Github](https://github.com/lauhaide/WikiCatSum)
#### paper
[Arxiv](https://arxiv.org/abs/1906.04687)
#### authors
Laura Perez-Beltrachini, Yang Liu, Mirella Lapata (University of Edinburgh) Peter J. Liu, Mohammad Saleh, Etienne Pot, Ben Goodrich, Ryan Sepassi, Lukasz Kaiser, Noam Shazeer (GoogleBrain)
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Github](https://github.com/lauhaide/WikiCatSum)
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Website](https://datashare.ed.ac.uk/handle/10283/3368)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[Arxiv](https://arxiv.org/abs/1906.04687)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@inproceedings{perez-beltrachini-etal-2019-generating,
title = "Generating Summaries with Topic Templates and Structured Convolutional Decoders",
author = "Perez-Beltrachini, Laura and
Liu, Yang and
Lapata, Mirella",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P19-1504",
doi = "10.18653/v1/P19-1504",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Laura Perez-Beltrachini
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
lperez@ed.ac.uk
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-3.0: Creative Commons Attribution Share Alike 3.0 Unported
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
Research on multi-document abstractive summarisation.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Summarization
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Summarise the most important facts of a given entity in the Film, Company, and Animal domains from a cluster of related documents.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`industry`, `academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Google Cloud Platform, University of Edinburgh
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Laura Perez-Beltrachini, Yang Liu, Mirella Lapata (University of Edinburgh) Peter J. Liu, Mohammad Saleh, Etienne Pot, Ben Goodrich, Ryan Sepassi, Lukasz Kaiser, Noam Shazeer (GoogleBrain)
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
Google Cloud Platform, European Research Council
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Ronald Cardenas (University of Edinburgh) Laura Perez-Beltrachini (University of Edinburgh)
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
- `id`: the ID of the data example
- `title`: the Wikipedia article's title
- `paragraphs`: the ranked list of paragraphs from the set of crawled texts
- `summary`: a list of summary sentences together with their corresponding topic labels
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
This is a truncated example from the animal setting:
```
{'gem_id': 'animal-train-1',
'gem_parent_id': 'animal-train-1',
'id': '2652',
'paragraphs': ["lytrosis (hulst) of louisiana vernon antoine brou jr. 2005. southern lepidopterists' news, 27: 7 ., ..."],
'references': ['lytrosis unitaria , the common lytrosis moth, is a species of moth of the geometridae family. it is found in north america, including arkansas, georgia, iowa , massachusetts, and wisconsin. the wingspan is about 50 mm. the larvae feed on rosa, crataegus, amelanchier, acer, quercus and viburnum species.'],
'summary': {'text': ['lytrosis unitaria , the common lytrosis moth , is a species of moth of the geometridae family .',
'it is found in north america , including arkansas , georgia , iowa , massachusetts , new hampshire , new jersey , new york , north carolina , ohio , oklahoma , ontario , pennsylvania , south carolina , tennessee , texas , virginia , west virginia and wisconsin .',
'the wingspan is about 50 mm .',
'the larvae feed on rosa , crataegus , amelanchier , acer , quercus and viburnum species . '],
'topic': [29, 20, 9, 8]},
'target': 'lytrosis unitaria , the common lytrosis moth, is a species of moth of the geometridae family. it is found in north america, including arkansas, georgia, iowa , massachusetts, and wisconsin. the wingspan is about 50 mm. the larvae feed on rosa, crataegus, amelanchier, acer, quercus and viburnum species.',
'title': 'lytrosis unitaria'}
```
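Since `summary.text` and `summary.topic` are parallel lists, each sentence can be paired with its topic label; a short sketch over a truncated, hand-copied example:

```python
# Truncated example in the shape shown above (values copied for illustration).
example = {
    "summary": {
        "text": [
            "lytrosis unitaria , the common lytrosis moth , is a species of moth of the geometridae family .",
            "the wingspan is about 50 mm .",
        ],
        "topic": [29, 9],
    },
}

# Pair each summary sentence with its topic label.
sentence_topic_pairs = list(
    zip(example["summary"]["topic"], example["summary"]["text"])
)
```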
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
The number of instances in train/validation/test is 50,938/2,855/2,831.
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
The data was split i.i.d., i.e. uniformly split into training, validation, and test datasets.
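The i.i.d. split described above can be sketched as a uniform random partition. The fractions below are illustrative (roughly the 90/5/5 ratio of the actual splits), not the exact procedure used by the authors:

```python
import random

def iid_split(examples, train_frac=0.9, valid_frac=0.05, seed=0):
    # Shuffle uniformly at random, then slice into train/validation/test.
    items = list(examples)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(n * train_frac)
    n_valid = int(n * valid_frac)
    return (items[:n_train],
            items[n_train:n_train + n_valid],
            items[n_train + n_valid:])

train, valid, test = iid_split(range(100))
```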
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
Evaluation of models' performance on noisy (document, summary) pairs and long inputs, and of their capabilities to generalise and to mitigate biases.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
no
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
Capabilities to generalise, mitigate biases, factual correctness.
### GEM-Specific Curation
#### Modified for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes
#### GEM Modifications
<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`annotations added`
#### Modification Details
<!-- info: For each of these changes, described them in more details and provided the intended purpose of the modification -->
<!-- scope: microscope -->
We provide topic labels for summary sentences.
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
- [Generating Wikipedia by Summarizing Long Sequences](https://arxiv.org/abs/1801.10198)
- [Generating Summaries with Topic Templates and Structured Convolutional Decoders](https://arxiv.org/abs/1906.04687)
- [Noisy Self-Knowledge Distillation for Text Summarization](https://arxiv.org/abs/2009.07032)
And all references in these papers.
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Capabilities to generalise, mitigate biases, factual correctness.
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`ROUGE`, `BERT-Score`, `MoverScore`, `Other: Other Metrics`
#### Other Metrics
<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
- Abstract/Copy
- Factual accuracy based on the score of (Goodrich et al., 2019) and the relation extraction system of (Sorokin and Gurevych, 2017).
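As a rough illustration of the n-gram overlap behind ROUGE, here is a simplified unigram recall (not the official ROUGE implementation, which adds stemming, n-gram orders, and F-measures):

```python
from collections import Counter

def rouge1_recall(reference, candidate):
    # Fraction of reference unigrams that also appear in the candidate,
    # counting each token at most as often as it occurs in the reference.
    ref_counts = Counter(reference.split())
    cand_counts = Counter(candidate.split())
    overlap = sum(min(count, cand_counts[token])
                  for token, count in ref_counts.items())
    return overlap / sum(ref_counts.values())

score = rouge1_recall("the wingspan is about 50 mm", "the wingspan is 50 mm")
```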
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
Human-based evaluations are Question Answering and Ranking (Content, Fluency and Repetition).
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
Those listed above.
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
- [Generating Summaries with Topic Templates and Structured Convolutional Decoders](https://arxiv.org/abs/1906.04687)
- [Noisy Self-Knowledge Distillation for Text Summarization](https://arxiv.org/abs/2009.07032)
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
The dataset is a subset of the WikiSum (Liu et al., 2018) dataset focusing on summaries of entities in three domains (Film, Company, and Animal). It is multi-document summarisation where input-output pairs for each example entity are created as follows. The input is a set of paragraphs collected from i) documents in the Reference section of the entity's Wikipedia page plus ii) documents collected from the top ten search results after querying Google search engine with the entity name. The output summary is the Wikipedia abstract for the entity.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
Generate descriptive summaries for entities in specific domains, where certain topics are discussed, generally in specific orders.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
yes
#### Source Details
<!-- info: List the sources (one per line) -->
<!-- scope: periscope -->
WikiSum (Liu et al., 2018)
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Other`
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
The dataset and task focuses on summaries for entities in three domains: Company, Film, and Animal.
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
not validated
#### Data Preprocessing
<!-- info: How was the text data pre-processed? (Enter N/A if the text was not pre-processed) -->
<!-- scope: microscope -->
Summary sentences are associated with a topic label. There is a topic model for each domain.
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
not filtered
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
automatically created
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
#### Annotation Values
<!-- info: Purpose and values for each annotation -->
<!-- scope: microscope -->
Each summary sentence was annotated with a topic label. There is a topic model for each of the three domains. This was used to guide a hierarchical decoder.
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
validated by data curators
#### Quality Control Details
<!-- info: Describe the quality control measures that were taken. -->
<!-- scope: microscope -->
Manual inspection of a sample of topics assigned to sentences. The number of topics was selected based on the performance of the summarisation model.
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
#### Justification for Using the Data
<!-- info: If not, what is the justification for reusing the data? -->
<!-- scope: microscope -->
The dataset is based on Wikipedia and on referenced and retrieved documents crawled from the Web.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
unlikely
#### Any PII Identification?
<!-- info: Did the curators use any automatic/manual method to identify PII in the dataset? -->
<!-- scope: periscope -->
no identification
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
yes
#### Links and Summaries of Analysis Work
<!-- info: Provide links to and summaries of works analyzing these biases. -->
<!-- scope: microscope -->
This dataset is based on Wikipedia, and thus bias analyses of other Wikipedia-based datasets potentially apply to WikiCatSum as well. For instance, see the analysis for the ToTTo dataset in [1].
[1] Automatic Construction of Evaluation Suites for Natural Language Generation Datasets
https://openreview.net/forum?id=CSi1eu_2q96
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`public domain`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`public domain`
### Known Technical Limitations
|
Amitnaik1718/indian_food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1269067280.1594334
num_examples: 5328
- name: test
num_bytes: 229322892.3925666
num_examples: 941
download_size: 1601553689
dataset_size: 1498390172.552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
umesh16071973/HRMS_training_Dataset | ---
license: apache-2.0
---
|
allandclive/Luganda_news_articles | ---
task_categories:
- text2text-generation
- text-generation
language:
- lg
size_categories:
- 10K<n<100K
---
# Luganda News Articles
Luganda (lug) is one of the most spoken languages in Uganda.
Scraped from https://www.bukedde.co.ug/ & https://gambuuze.ug/ |
davanstrien/satclip | ---
tags:
- geospatial
pretty_name: S2-100K
---
# Dataset Card for S2-100K
<!-- Provide a quick summary of the dataset. -->
> The S2-100K dataset is a dataset of 100,000 multi-spectral satellite images sampled from Sentinel-2 via the Microsoft Planetary Computer. Copernicus Sentinel data is captured between Jan 1, 2021 and May 17, 2023. The dataset is sampled approximately uniformly over landmass and only includes images without cloud coverage. The dataset is available for research purposes only. If you use the dataset, please cite our paper. More information on the dataset can be found in our paper.
See this [GitHub repo](https://github.com/microsoft/satclip/) for more details.
## Dataset Details
### Dataset Description
> SatCLIP trains location and image encoders via contrastive learning, by matching images to their corresponding locations. This is analogous to the CLIP approach, which matches images to their corresponding text.
> Through this process, the location encoder learns characteristics of a location, as represented by satellite imagery.
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
To download the dataset you can use the `huggingface_hub` library.
```python
from huggingface_hub import snapshot_download
snapshot_download("davanstrien/satclip", local_dir='.', repo_type='dataset')
```
Alternatively you can run
```bash
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/datasets/davanstrien/satclip
```
To extract the images you can run the following command.
```bash
ls image/*.tar.xz | xargs -n1 tar -xJf
```
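If `tar` is unavailable, a minimal Python equivalent is sketched below (the `image/` directory layout is taken from the command above):

```python
import glob
import tarfile

def extract_archives(pattern="image/*.tar.xz", dest="."):
    # tarfile auto-detects the xz compression when opened with mode "r:*".
    for path in glob.glob(pattern):
        with tarfile.open(path, "r:*") as archive:
            archive.extractall(dest)

extract_archives()
```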
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@article{klemmer2023satclip,
title={SatCLIP: Global, General-Purpose Location Embeddings with Satellite Imagery},
author={Klemmer, Konstantin and Rolf, Esther and Robinson, Caleb and Mackey, Lester and Ru{\ss}wurm, Marc},
journal={arXiv preprint arXiv:2311.17179},
year={2023}
}
``` |
Locutusque/hyperion-v3.0 | ---
license: apache-2.0
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 3210068995.811935
num_examples: 1665372
download_size: 1497036692
dataset_size: 3210068995.811935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- 1M<n<10M
---
Hyperion-3.0 has significantly improved performance over its predecessors.
"I found that having more code datasets than general purpose datasets ironically decreases performance in both coding and general tasks."
Data sources:
- OpenOrca/SlimOrca
- cognitivecomputations/dolphin (300k examples)
- microsoft/orca-math-word-problems-200k (60k examples)
- glaiveai/glaive-code-assistant
- Vezora/Tested-22k-Python-Alpaca
- Unnatural Instructions
- BI55/MedText
- LDJnr/Pure-Dove
- Various domain-specific datasets by Camel
- teknium/GPTeacher-General-Instruct
- WizardLM Evol Instruct 70K and 140K
- Various chat log datasets by Collective Cognition
- totally-not-an-llm/EverythingLM-data-V3
- Crystalcareai/alpaca-gpt4-COT
- m-a-p/Code-Feedback
- Various medical datasets by CogStack
- jondurbin/airoboros-3.2
- garage-bAInd/Open-Platypus
- Lmsys chat 1M - GPT-4 Generations only
- FuseAI/FuseChat-Mixture
- abacusai/SystemChat
- Locutusque/ToM-sharegpt4
- Locutusque/arc-cot |
thesven/bengali-ai-train-set-tiny | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: valid
num_bytes: 961362832
num_examples: 1000
- name: train
num_bytes: 9612150048
num_examples: 10000
download_size: 1670313269
dataset_size: 10573512880
---
# Dataset Card for "bengali-ai-train-set-tiny"
# Dataset Description
- **Homepage:** [OOD-Speech: A Large Bengali Speech Recognition Dataset for Out-of-Distribution Benchmarking](https://arxiv.org/abs/2305.09688)
- **Paper:** [OOD-Speech: A Large Bengali Speech Recognition Dataset for Out-of-Distribution Benchmarking](https://arxiv.org/abs/2305.09688)
### Whisper Model Information
- **Model Homepage:** [openai/whisper-tiny on Hugging Face](https://huggingface.co/openai/whisper-tiny)
- **Model Paper:** [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356)
## Dataset Summary
This dataset is designed to help finetune the `openai/whisper-tiny` model with additional information in the Bengali language. It consists of an additional 11,000 labeled audio samples from the OOD-Speech dataset, specifically designed for out-of-distribution benchmarking in Bengali.
## Supported Tasks and Leaderboards
The primary task supported by this dataset is automatic speech recognition (ASR) in the Bengali language, specifically for finetuning the `openai/whisper-tiny` model.
## Languages
The dataset is in Bengali.
## Dataset Structure
### Data Instances
Each instance in the dataset consists of an audio sample in Bengali along with its corresponding transcription.
### Data Fields
- `audio`: The audio sample in Bengali.
- `transcription`: The corresponding transcription of the audio sample in Bengali.
### Data Splits
The dataset is split into training and validation sets. The training set consists of 10,000 samples, and the validation set consists of 1,000 samples.
## Additional Information
### Dataset Curators
The dataset has been curated by "thesven".
### Licensing Information
Licensing information for the OOD-Speech dataset can be found in the original paper.
### Citation Information
```
@article{OOD-Speech2023,
  title={OOD-Speech: A Large Bengali Speech Recognition Dataset for Out-of-Distribution Benchmarking},
  author={Authors of the OOD-Speech paper},
  journal={arXiv preprint arXiv:2305.09688},
  year={2023}
}
``` |
Bonfire79/clinrec_01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 178226
num_examples: 134
download_size: 61701
dataset_size: 178226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pkuHaowei/stanford-cars | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 245338060.0
num_examples: 8144
- name: test
num_bytes: 241985926.875
num_examples: 8041
download_size: 482701950
dataset_size: 487323986.875
---
# Dataset Card for "stanford-cars"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
proserve/medical-instruct-mixer | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 889897065.0
num_examples: 528642
- name: test
num_bytes: 68932987.0
num_examples: 28501
download_size: 482795421
dataset_size: 958830052.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Sijuade/Cats-Dogs-Birds | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
splits:
- name: train
num_bytes: 2858440330.32
num_examples: 13344
download_size: 2752316017
dataset_size: 2858440330.32
---
|
arieg/color_spec_cls | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '140'
'2': '141'
'3': '190'
'4': '193'
'5': '194'
'6': '197'
'7': '2'
'8': '200'
'9': '5'
splits:
- name: train
num_bytes: 10354796.0
num_examples: 100
download_size: 10356873
dataset_size: 10354796.0
---
# Dataset Card for "color_spec_cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roszcz/tmp-midi-clip | ---
dataset_info:
features:
- name: midi_filename
dtype: string
- name: pitch
sequence: int16
length: 32
- name: dstart_bin
sequence: int16
length: 32
- name: duration_bin
sequence: int16
length: 32
- name: velocity_bin
sequence: int16
length: 32
splits:
- name: train
num_bytes: 118752197
num_examples: 352232
- name: validation
num_bytes: 13434506
num_examples: 39754
- name: test
num_bytes: 15540656
num_examples: 46073
download_size: 21481498
dataset_size: 147727359
---
# Dataset Card for "tmp-midi-clip"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tarasabkar/IEMOCAP_Speech | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: emotion
dtype:
class_label:
names:
0: ang
1: hap
2: neu
3: sad
splits:
- name: Session1
num_bytes: 167102058.95
num_examples: 1085
- name: Session2
num_bytes: 150799933.454
num_examples: 1023
- name: Session3
num_bytes: 167088514.51
num_examples: 1151
- name: Session4
num_bytes: 145505839.808
num_examples: 1031
- name: Session5
num_bytes: 170307009.46
num_examples: 1241
download_size: 788399921
dataset_size: 800803356.182
---
# Dataset Card for "IEMOCAP_Speech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShariThomas/dataset_sample | ---
license: mit
---
|
Ammar-Azman/crawl-doktorbudak | ---
license: mit
---
|
CyberHarem/mizusawa_matsuri_citrus | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mizusawa Matsuri
This is the dataset of Mizusawa Matsuri, containing 90 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 90 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 197 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 233 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 90 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 90 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 90 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 197 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 197 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 171 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 233 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 233 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
tyzhu/find_first_sent_train_100_eval_10_dec | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: text
dtype: string
splits:
- name: validation
num_bytes: 11337
num_examples: 10
- name: train
num_bytes: 379104
num_examples: 210
download_size: 197674
dataset_size: 390441
---
# Dataset Card for "find_first_sent_train_100_eval_10_dec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gentilrenard/lmd_ukraine_comments | ---
language:
- fr
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-classification
pretty_name: Comments under Le Monde Ukraine war articles (1 year)
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 133853
num_examples: 323
- name: validation
num_bytes: 54736
num_examples: 139
- name: unlabeled
num_bytes: 64192366
num_examples: 174891
download_size: 39789476
dataset_size: 64380955
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: unlabeled
path: data/unlabeled-*
---
## Comments under Le Monde Ukraine War Articles (1 Year)
### Description
This dataset contains 175k comments extracted from Le Monde articles about the Ukraine war during its first year (February 2022 to 2023).
Among these, around 500 comments are manually labeled into three categories: 0 (explicit support for Ukraine), 1 (pro-Russia), and 2 (other).
### Dataset Structure
#### Features
- `text`: The comment text (string).
- `label`: The label for the comment (integer). The labels are as follows:
- 0: pro_Ukraine
- 1: pro_Russia
- 2: other
- 4: no_label (the unlabeled data).
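A small sketch of decoding the integer labels into the names listed above (the mapping is taken directly from this card):

```python
LABEL_NAMES = {0: "pro_Ukraine", 1: "pro_Russia", 2: "other", 4: "no_label"}

def decode_labels(label_ids):
    # Map each integer label to its human-readable name.
    return [LABEL_NAMES[i] for i in label_ids]

decoded = decode_labels([0, 2, 4])
```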
#### Splits
The train and validation splits are manually labeled. The unlabeled split could be used, for instance, for knowledge distillation.
- `train`: 323 examples.
- `validation`: 139 examples.
- `unlabeled`: 174,891 examples.
### Additional Information
- **Homepage**: [Project Repository](https://github.com/matthieuvion/lmd_classi)
- **License**: MIT License
- **Language**: French
- **Task Categories**: Text Classification
- **Size Categories**: 100K < n < 1M
|
ml6team/xsum_nl | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- nl
language_bcp47:
- nl-BE
license:
- unknown
multilinguality:
- monolingual
pretty_name: XSum NL
size_categories:
- unknown
source_datasets:
- extended|xsum
task_categories:
- conditional-text-generation
task_ids:
- summarization
---
# Dataset Card for XSum NL
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is a machine translated dataset. It's the [XSum dataset](https://huggingface.co/datasets/xsum) translated with [this model](https://huggingface.co/Helsinki-NLP/opus-mt-en-nl) from English to Dutch.
See the [Hugginface page of the original dataset](https://huggingface.co/datasets/xsum) for more information on the format of this dataset.
Use with:
```python
from datasets import load_dataset
load_dataset("ml6team/xsum_nl")
```
### Languages
Dutch
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `id`: BBC ID of the article.
- `document`: a string containing the body of the news article
- `summary`: a string containing a one sentence summary of the article.
### Data Splits
- `train`
- `test`
- `validation`
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
jtgot/ServicesClassificationData | ---
license: apache-2.0
---
|
MruganKulkarni/restomenuu | ---
license: mit
---
|
Circularmachines/batch_indexing_machine_230529_006 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 156741720.0
num_examples: 720
download_size: 156752582
dataset_size: 156741720.0
---
# Dataset Card for "batch_indexing_machine_230529_006"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diguinn17/diguito | ---
license: openrail
---
|
open-llm-leaderboard/details_sail__Sailor-7B | ---
pretty_name: Evaluation run of sail/Sailor-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sail/Sailor-7B](https://huggingface.co/sail/Sailor-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sail__Sailor-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T06:19:10.406963](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B/blob/main/results_2024-03-03T06-19-10.406963.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5442833810166537,\n\
\ \"acc_stderr\": 0.0340188362831904,\n \"acc_norm\": 0.5493690106415303,\n\
\ \"acc_norm_stderr\": 0.03472015784641931,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.40083895524705915,\n\
\ \"mc2_stderr\": 0.013870866160876278\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4718430034129693,\n \"acc_stderr\": 0.014588204105102202,\n\
\ \"acc_norm\": 0.49829351535836175,\n \"acc_norm_stderr\": 0.014611305705056992\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5520812587134037,\n\
\ \"acc_stderr\": 0.0049626384463959845,\n \"acc_norm\": 0.7620991834295957,\n\
\ \"acc_norm_stderr\": 0.004249278842903416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983056,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983056\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943252,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845443,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.032468167657521745,\n\
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.032468167657521745\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7174311926605504,\n \"acc_stderr\": 0.019304243497707152,\n \"\
acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.019304243497707152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0306858205966108,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0306858205966108\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225629,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225629\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02791405551046801,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02791405551046801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347824,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516478,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516478\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940975,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940975\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37614080834419816,\n\
\ \"acc_stderr\": 0.012372214430599826,\n \"acc_norm\": 0.37614080834419816,\n\
\ \"acc_norm_stderr\": 0.012372214430599826\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5228758169934641,\n \"acc_stderr\": 0.020206653187884786,\n \"\
acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.020206653187884786\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.40083895524705915,\n\
\ \"mc2_stderr\": 0.013870866160876278\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.691397000789266,\n \"acc_stderr\": 0.012982160200926574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33358605003790753,\n \
\ \"acc_stderr\": 0.012987282131410809\n }\n}\n```"
repo_url: https://huggingface.co/sail/Sailor-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|arc:challenge|25_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|arc:challenge|25_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|gsm8k|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|gsm8k|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hellaswag|10_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hellaswag|10_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T22-29-37.991675.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-19-10.406963.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T06-19-10.406963.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- '**/details_harness|winogrande|5_2024-03-02T22-29-37.991675.parquet'
- split: 2024_03_03T06_19_10.406963
path:
- '**/details_harness|winogrande|5_2024-03-03T06-19-10.406963.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T06-19-10.406963.parquet'
- config_name: results
data_files:
- split: 2024_03_02T22_29_37.991675
path:
- results_2024-03-02T22-29-37.991675.parquet
- split: 2024_03_03T06_19_10.406963
path:
- results_2024-03-03T06-19-10.406963.parquet
- split: latest
path:
- results_2024-03-03T06-19-10.406963.parquet
---
# Dataset Card for Evaluation run of sail/Sailor-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sail/Sailor-7B](https://huggingface.co/sail/Sailor-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sail__Sailor-7B",
"harness_winogrande_5",
split="train")
```
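Each configuration also exposes a `latest` split alias pointing at the most recent run. Because the timestamped split names are zero-padded, they sort lexicographically in chronological order, so picking the newest run by hand is a simple maximum; a minimal sketch using the split names listed in this card's metadata:

```python
# Split names as they appear in this card's metadata; "latest" is an alias
# maintained alongside the timestamped runs.
splits = ["2024_03_02T22_29_37.991675", "2024_03_03T06_19_10.406963", "latest"]

# Zero-padded timestamps sort lexicographically in chronological order,
# so the newest run is simply the maximum of the timestamped names.
newest = max(s for s in splits if s != "latest")
print(newest)  # 2024_03_03T06_19_10.406963

# The same run can then be requested explicitly instead of via "latest", e.g.:
# load_dataset("open-llm-leaderboard/details_sail__Sailor-7B",
#              "harness_winogrande_5", split=newest)
```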
## Latest results
These are the [latest results from run 2024-03-03T06:19:10.406963](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B/blob/main/results_2024-03-03T06-19-10.406963.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5442833810166537,
"acc_stderr": 0.0340188362831904,
"acc_norm": 0.5493690106415303,
"acc_norm_stderr": 0.03472015784641931,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.40083895524705915,
"mc2_stderr": 0.013870866160876278
},
"harness|arc:challenge|25": {
"acc": 0.4718430034129693,
"acc_stderr": 0.014588204105102202,
"acc_norm": 0.49829351535836175,
"acc_norm_stderr": 0.014611305705056992
},
"harness|hellaswag|10": {
"acc": 0.5520812587134037,
"acc_stderr": 0.0049626384463959845,
"acc_norm": 0.7620991834295957,
"acc_norm_stderr": 0.004249278842903416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983056,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983056
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943252,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845443,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0306858205966108,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0306858205966108
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225629,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225629
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02791405551046801,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02791405551046801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347824,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516478,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516478
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940975,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940975
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37614080834419816,
"acc_stderr": 0.012372214430599826,
"acc_norm": 0.37614080834419816,
"acc_norm_stderr": 0.012372214430599826
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.020206653187884786,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.020206653187884786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.40083895524705915,
"mc2_stderr": 0.013870866160876278
},
"harness|winogrande|5": {
"acc": 0.691397000789266,
"acc_stderr": 0.012982160200926574
},
"harness|gsm8k|5": {
"acc": 0.33358605003790753,
"acc_stderr": 0.012987282131410809
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
danjacobellis/audio_har_descript_44kHz_frames_1200 | ---
dataset_info:
features:
- name: codes
dtype:
array2_d:
shape:
- 9
- 640
dtype: float32
- name: label
dtype:
class_label:
names:
'0': No Activity
'1': Writing
'2': Drawing
'3': Cutting paper
'4': Typing on keyboard
'5': Typing on phone
'6': Browsing on phone
'7': Clapping
'8': Shuffling cards
'9': Scratching
'10': Wiping table
'11': Brushing hair
'12': Washing hands
'13': Drinking
'14': Eating snacks
'15': Brushing teeth
'16': Chopping
'17': Grating
'18': Frying
'19': Sweeping
'20': Vacuuming
'21': Washing dishes
'22': Filling water
'23': Using microwave
- name: label_str
dtype: string
- name: participant
dtype: int32
splits:
- name: train
num_bytes: 28945873
num_examples: 669
download_size: 9026774
dataset_size: 28945873
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
johannes-garstenauer/structs_token_size_4_use_pd_True_full_amt_False_div_20 | ---
dataset_info:
features:
- name: struct
dtype: string
splits:
- name: train
num_bytes: 25156800
num_examples: 237600
download_size: 7394910
dataset_size: 25156800
---
# Dataset Card for "structs_token_size_4_use_pd_True_full_amt_False_div_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
senhorsapo/rem | ---
license: openrail
---
|
mtkinit/Super5473892 | ---
pretty_name: Super5473892
tags:
- aaa
---
# Super5473892
Created from AIOD platform |
Faiza3/anime_cloth | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 210633.0
num_examples: 15
download_size: 211995
dataset_size: 210633.0
---
# Dataset Card for "anime_cloth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liyucheng/zhihu_rlhf_3k | ---
license: cc-by-2.0
---
|
CyberHarem/priestess_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Priestess/普瑞赛斯 (Arknights)
This is the dataset of Priestess/普瑞赛斯 (Arknights), containing 22 images and their tags.
The core tags of this character are `long_hair, hairband, breasts, brown_hair, black_hair, black_hairband, purple_eyes, bow, hair_between_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 28.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/priestess_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 22 | 24.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/priestess_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 42.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/priestess_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
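All download links in the table above follow one URL pattern (`dataset-<name>.zip` under the repository's `resolve/main` path), so the link for any package can be built programmatically. The helper below is a hypothetical convenience sketch, not part of waifuc or this repository:

```python
def package_url(repo_id: str, package: str) -> str:
    """Build the direct download URL for a packaged zip of this dataset repo.

    `package` is the package name from the table above,
    e.g. 'raw', '1200', or 'stage3-p480-1200'.
    """
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/dataset-{package}.zip"

# For example, the 1200 package of this dataset:
print(package_url("CyberHarem/priestess_arknights", "1200"))
# → https://huggingface.co/datasets/CyberHarem/priestess_arknights/resolve/main/dataset-1200.zip
```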
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/priestess_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, simple_background, long_sleeves, white_background, closed_mouth, shirt, upper_body, jacket, open_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | simple_background | long_sleeves | white_background | closed_mouth | shirt | upper_body | jacket | open_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------------------|:---------------|:-------------------|:---------------|:--------|:-------------|:---------|:---------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
Deojoandco/reddit-ah-dialogturns-annotations | ---
dataset_info:
features:
- name: id
dtype: string
- name: speaker
dtype: string
- name: text
dtype: string
- name: annotation
dtype: string
splits:
- name: train
num_bytes: 3772164
num_examples: 16055
- name: validation
num_bytes: 376937
num_examples: 1641
- name: test
num_bytes: 360334
num_examples: 1559
download_size: 0
dataset_size: 4509435
---
# Dataset Card for "reddit-ah-dialogturns-annotations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/oklahoma_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oklahoma/オクラホマ/俄克拉荷马 (Azur Lane)
This is the dataset of oklahoma/オクラホマ/俄克拉荷马 (Azur Lane), containing 28 images and their tags.
The core tags of this character are `ahoge, blue_eyes, breasts, hair_between_eyes, blonde_hair, short_hair, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 34.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 18.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 39.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 30.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 61.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oklahoma_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, detached_sleeves, open_mouth, hat, simple_background, :d, boots, brown_gloves, white_background, brown_skirt, cleavage_cutout, long_sleeves, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | detached_sleeves | open_mouth | hat | simple_background | :d | boots | brown_gloves | white_background | brown_skirt | cleavage_cutout | long_sleeves | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-------------|:------|:--------------------|:-----|:--------|:---------------|:-------------------|:--------------|:------------------|:---------------|:-----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_AtAndDev__Ogno-Monarch-Neurotic-9B-Passthrough | ---
pretty_name: Evaluation run of AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough](https://huggingface.co/AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__Ogno-Monarch-Neurotic-9B-Passthrough\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T17:09:32.814517](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__Ogno-Monarch-Neurotic-9B-Passthrough/blob/main/results_2024-03-01T17-09-32.814517.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6122781802684721,\n\
\ \"acc_stderr\": 0.032268454851190724,\n \"acc_norm\": 0.625230553768112,\n\
\ \"acc_norm_stderr\": 0.03315794745802012,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476185,\n \"mc2\": 0.5102578228172799,\n\
\ \"mc2_stderr\": 0.01648564659078862\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3924914675767918,\n \"acc_stderr\": 0.014269634635670717,\n\
\ \"acc_norm\": 0.46245733788395904,\n \"acc_norm_stderr\": 0.014570144495075583\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3632742481577375,\n\
\ \"acc_stderr\": 0.004799599840397383,\n \"acc_norm\": 0.5606452897829117,\n\
\ \"acc_norm_stderr\": 0.004952942072999274\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.03063659134869981,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.03063659134869981\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.016185444179457175,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.016185444179457175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476185,\n \"mc2\": 0.5102578228172799,\n\
\ \"mc2_stderr\": 0.01648564659078862\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|arc:challenge|25_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|gsm8k|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hellaswag|10_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T17-09-32.814517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T17-09-32.814517.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- '**/details_harness|winogrande|5_2024-03-01T17-09-32.814517.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T17-09-32.814517.parquet'
- config_name: results
data_files:
- split: 2024_03_01T17_09_32.814517
path:
- results_2024-03-01T17-09-32.814517.parquet
- split: latest
path:
- results_2024-03-01T17-09-32.814517.parquet
---
# Dataset Card for Evaluation run of AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough](https://huggingface.co/AtAndDev/Ogno-Monarch-Neurotic-9B-Passthrough) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AtAndDev__Ogno-Monarch-Neurotic-9B-Passthrough",
"harness_winogrande_5",
split="train")
```
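As the `configs` section above shows, each timestamped split name is derived from the run timestamp. A minimal sketch of that mapping (an observation from the metadata in this card, not an official API) is:

```python
# Sketch: derive the split name used in this dataset's configs from a run
# timestamp, by replacing "-" and ":" with "_" (fractional seconds kept as-is).
# This mirrors the pattern visible in the YAML metadata above; it is not a
# documented Hub convention.
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the corresponding split name."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-01T17:09:32.814517"))
# → 2024_03_01T17_09_32.814517
```

Passing that derived name as `split=` instead of `"latest"` selects one specific run when a configuration holds several.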
## Latest results
These are the [latest results from run 2024-03-01T17:09:32.814517](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__Ogno-Monarch-Neurotic-9B-Passthrough/blob/main/results_2024-03-01T17-09-32.814517.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6122781802684721,
"acc_stderr": 0.032268454851190724,
"acc_norm": 0.625230553768112,
"acc_norm_stderr": 0.03315794745802012,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476185,
"mc2": 0.5102578228172799,
"mc2_stderr": 0.01648564659078862
},
"harness|arc:challenge|25": {
"acc": 0.3924914675767918,
"acc_stderr": 0.014269634635670717,
"acc_norm": 0.46245733788395904,
"acc_norm_stderr": 0.014570144495075583
},
"harness|hellaswag|10": {
"acc": 0.3632742481577375,
"acc_stderr": 0.004799599840397383,
"acc_norm": 0.5606452897829117,
"acc_norm_stderr": 0.004952942072999274
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.03063659134869981,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.03063659134869981
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.016185444179457175,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.016185444179457175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476185,
"mc2": 0.5102578228172799,
"mc2_stderr": 0.01648564659078862
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
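The per-task entries above can be sliced programmatically. A small illustrative sketch, using a hand-copied subset of the JSON shown (an unweighted mean over `hendrycksTest-*` subtasks is for illustration only and is not necessarily how the leaderboard aggregates scores):

```python
# Subset copied verbatim from the results JSON above (illustrative only).
results = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6973684210526315},
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.5421686746987951},
}

# Filter MMLU subtasks by their "harness|hendrycksTest-" key prefix and
# average their normalized accuracies.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(round(sum(mmlu_scores) / len(mmlu_scores), 4))
# → 0.6181
```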
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_codellama__CodeLlama-7b-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-7b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T19:46:33.225068](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-hf/blob/main/results_2023-10-14T19-46-33.225068.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.00025680027497238217,\n \"f1\": 0.05123741610738289,\n\
\ \"f1_stderr\": 0.001242998424746743,\n \"acc\": 0.34582445982552373,\n\
\ \"acc_stderr\": 0.009790248772764803\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.00025680027497238217,\n\
\ \"f1\": 0.05123741610738289,\n \"f1_stderr\": 0.001242998424746743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05155420773313116,\n \
\ \"acc_stderr\": 0.006090887955262816\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6400947119179163,\n \"acc_stderr\": 0.01348960959026679\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T19_46_33.225068
path:
- '**/details_harness|drop|3_2023-10-14T19-46-33.225068.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T19-46-33.225068.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T19_46_33.225068
path:
- '**/details_harness|gsm8k|5_2023-10-14T19-46-33.225068.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T19-46-33.225068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T19_46_33.225068
path:
- '**/details_harness|winogrande|5_2023-10-14T19-46-33.225068.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T19-46-33.225068.parquet'
- config_name: results
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- results_2023-08-26T04:20:17.128606.parquet
- split: 2023_10_14T19_46_33.225068
path:
- results_2023-10-14T19-46-33.225068.parquet
- split: latest
path:
- results_2023-10-14T19-46-33.225068.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-14T19:46:33.225068](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-hf/blob/main/results_2023-10-14T19-46-33.225068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497238217,
"f1": 0.05123741610738289,
"f1_stderr": 0.001242998424746743,
"acc": 0.34582445982552373,
"acc_stderr": 0.009790248772764803
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497238217,
"f1": 0.05123741610738289,
"f1_stderr": 0.001242998424746743
},
"harness|gsm8k|5": {
"acc": 0.05155420773313116,
"acc_stderr": 0.006090887955262816
},
"harness|winogrande|5": {
"acc": 0.6400947119179163,
"acc_stderr": 0.01348960959026679
}
}
```
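As a quick sanity check, the aggregated `acc` under `"all"` appears to be the mean of the per-task accuracies. A minimal sketch using the values shown above (the dictionary below is copied from the JSON, not fetched from the Hub):

```python
# Per-task accuracies from the latest run shown above.
latest_results = {
    "harness|gsm8k|5": {"acc": 0.05155420773313116, "acc_stderr": 0.006090887955262816},
    "harness|winogrande|5": {"acc": 0.6400947119179163, "acc_stderr": 0.01348960959026679},
}

def mean_acc(results: dict) -> float:
    """Average the 'acc' metric over every task that reports one."""
    accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
    return sum(accs) / len(accs)

print(mean_acc(latest_results))  # matches the aggregated "acc" in "all" above
```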
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MilanHrab/Kosice_training | ---
dataset_info:
features:
- name: name_of_record
dtype: string
- name: speech_array
sequence: float64
- name: sampling_rate
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 1178840561.6
num_examples: 4480
download_size: 894629427
dataset_size: 1178840561.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Kosice_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yakumo_ran_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yakumo_ran/八雲藍/야쿠모란 (Touhou)
This is the dataset of yakumo_ran/八雲藍/야쿠모란 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, fox_tail, tail, multiple_tails, yellow_eyes, hat, animal_ears, fox_ears, breasts, pillow_hat`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 614.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 392.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1158 | 786.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 569.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1158 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yakumo_ran_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, solo, tabard, white_dress, wide_sleeves, bangs, looking_at_viewer, white_headwear, blush, frills, large_breasts, simple_background, white_background |
| 1 | 16 |  |  |  |  |  | 1girl, bangs, long_sleeves, solo, tabard, white_dress, wide_sleeves, looking_at_viewer, white_headwear, frills, closed_mouth, simple_background, smile, hands_in_opposite_sleeves, hair_between_eyes, white_background, upper_body, blush |
| 2 | 18 |  |  |  |  |  | 1girl, long_sleeves, solo, tabard, looking_at_viewer, wide_sleeves, hands_in_opposite_sleeves, smile, white_dress, large_breasts, white_background |
| 3 | 18 |  |  |  |  |  | 1girl, hands_in_opposite_sleeves, solo, smile, wide_sleeves |
| 4 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, open_mouth, smile, tabard, large_breasts, no_headwear, upper_body, animal_ear_fluff |
| 5 | 5 |  |  |  |  |  | 1girl, fox_mask, solo |
| 6 | 5 |  |  |  |  |  | 1girl, closed_mouth, jeans, large_breasts, looking_at_viewer, simple_background, slit_pupils, solo, bangs, barefoot, blush, seiza, white_background, blue_pants, full_body, no_tail, short_sleeves, blue_shirt, long_sleeves, sweater |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | long_sleeves | solo | tabard | white_dress | wide_sleeves | bangs | looking_at_viewer | white_headwear | blush | frills | large_breasts | simple_background | white_background | smile | hands_in_opposite_sleeves | hair_between_eyes | upper_body | open_mouth | no_headwear | animal_ear_fluff | fox_mask | jeans | slit_pupils | barefoot | seiza | blue_pants | full_body | no_tail | short_sleeves | blue_shirt | sweater |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:-------|:---------|:--------------|:---------------|:--------|:--------------------|:-----------------|:--------|:---------|:----------------|:--------------------|:-------------------|:--------|:----------------------------|:--------------------|:-------------|:-------------|:--------------|:-------------------|:-----------|:--------|:--------------|:-----------|:--------|:-------------|:------------|:----------|:----------------|:-------------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | | X | X | X | X | X | | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | X | | | | X | | X | | X | | | X | | | X | X | X | X | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | | | | X | X | | X | | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
sudy-super/piece-of-refined-oscar | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ja
size_categories:
- 1M<n<10M
---
# Description
This dataset is a cleaned subset of [OSCAR-2301](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301).
It contains about 0.5B tokens, as counted with the [calm2](https://huggingface.co/cyberagent/calm2-7b) tokenizer.
# NOTE
This dataset has not undergone sentence-end boundary detection or perplexity filtering, so there is room for improvement in quality.
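A token count like the one above can be reproduced with any tokenizer exposing an `encode` method. A minimal sketch (the actual figure was computed with the calm2 tokenizer; the `transformers` lines are shown commented out as an assumption, since they require a model download):

```python
def count_tokens(texts, tokenizer) -> int:
    """Sum the token counts over an iterable of documents."""
    return sum(len(tokenizer.encode(text)) for text in texts)

# With the real tokenizer it would look like:
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("cyberagent/calm2-7b")
# total = count_tokens(dataset["text"], tokenizer)

# A tiny stand-in tokenizer to illustrate the interface:
class WhitespaceTokenizer:
    def encode(self, text):
        return text.split()

print(count_tokens(["hello world", "one two three"], WhitespaceTokenizer()))  # 5
```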
nikchar/paper_test_set | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: label
dtype: string
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15920562
num_examples: 11073
download_size: 6320618
dataset_size: 15920562
---
# Dataset Card for "paper_test_set"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_existential_you_have | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 183415
num_examples: 823
- name: dev_mismatched
num_bytes: 167912
num_examples: 686
- name: test_matched
num_bytes: 181716
num_examples: 817
- name: test_mismatched
num_bytes: 154207
num_examples: 688
- name: train
num_bytes: 7401659
num_examples: 32434
download_size: 4938545
dataset_size: 8088909
---
# Dataset Card for "MULTI_VALUE_mnli_existential_you_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordPets_Multimodal_Fatima_opt_175b_LLM_Description_opt175b_downstream_tasks_ViT_L_14 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: text
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: test
num_bytes: 3482068.0
num_examples: 100
download_size: 3458504
dataset_size: 3482068.0
---
# Dataset Card for "OxfordPets_Multimodal_Fatima_opt_175b_LLM_Description_opt175b_downstream_tasks_ViT_L_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/b6ea8c05 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1328
dataset_size: 176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "b6ea8c05"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imageomics/KABR | ---
license: cc0-1.0
task_categories:
- video-classification
tags:
- zebra
- giraffe
- plains zebra
- Grevy's zebra
- video
- animal behavior
- behavior recognition
- annotation
- annotated video
- conservation
- drone
- UAV
- imbalanced
- Kenya
- Mpala Research Centre
pretty_name: >-
KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition from Drone
Videos
size_categories:
- 1M<n<10M
---
# Dataset Card for KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition from Drone Videos
## Dataset Description
- **Homepage:** https://dirtmaxim.github.io/kabr/
- **Repository:** https://github.com/dirtmaxim/kabr-tools
- **Paper:** https://openaccess.thecvf.com/content/WACV2024W/CV4Smalls/papers/Kholiavchenko_KABR_In-Situ_Dataset_for_Kenyan_Animal_Behavior_Recognition_From_Drone_WACVW_2024_paper.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
We present a novel high-quality dataset for animal behavior recognition from drone videos.
The dataset is focused on Kenyan wildlife and contains behaviors of giraffes, plains zebras, and Grevy's zebras.
The dataset consists of more than 10 hours of annotated videos, and it includes eight different classes, encompassing seven types of animal behavior and an additional category for occluded instances.
In the annotation process for this dataset, a team of 10 people was involved, with an expert zoologist overseeing the process.
Each behavior was labeled based on its distinctive features, using a standardized set of criteria to ensure consistency and accuracy across the annotations.
The dataset was collected using drones that flew over the animals in the [Mpala Research Centre](https://mpala.org/) in Kenya, providing high-quality video footage of the animals' natural behaviors.
The drone footage is captured at a resolution of 5472 x 3078 pixels, and the videos were recorded at a frame rate of 29.97 frames per second.
<!--This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).-->
### Supported Tasks and Leaderboards
The results of our evaluation using I3D, SlowFast, and X3D architectures are given in the table below. For each one, the model was trained for 120 epochs with a batch size of 5. For more information on these results, see our [paper](https://openaccess.thecvf.com/content/WACV2024W/CV4Smalls/papers/Kholiavchenko_KABR_In-Situ_Dataset_for_Kenyan_Animal_Behavior_Recognition_From_Drone_WACVW_2024_paper.pdf).
| Method | All | Giraffes | Plains Zebras | Grevy’s Zebras |
| ---- | ---- | ---- | ---- | ---- |
| I3D (16x5) | 53.41 | 61.82 | 58.75 | 46.73 |
| SlowFast (16x5, 4x5) | 52.92 | 61.15 | 60.60 | 47.42 |
| X3D (16x5) | 61.9 | 65.1 | 63.11 | 51.16 |
### Languages
English
## Dataset Structure
Under `KABR/dataset/image/`, the data has been archived into `.zip` files, which are split into 2GB files. These must be recombined and extracted.
After cloning and navigating into the repository, you can use the following commands to do the reconstruction:
```bash
cd KABR/dataset/image/
cat giraffes_part_* > giraffes.zip
md5sum giraffes.zip # Compare this to what's shown with `cat giraffes_md5.txt`
unzip giraffes.zip
rm -rf giraffes_part_*
# Similarly for `zebras_grevys_part_*` and `zebras_plains_part_*`
```
Alternatively, there is a download script, `download.py`, which downloads the entire dataset in its established format without requiring you to clone the repository (cloning requires _at least_ double the size of the dataset in storage). To proceed with this approach, download `download.py` to the system where you want to access the data.
Then, in the same directory as the script, run the following to begin the download:
```bash
pip install requests
python download.py
```
This script then downloads all the files present in the repository (without making a clone of the `.git` directory, etc.), concatenates the part files into their ZIP archives, verifies the MD5 checksums, extracts the archives, and cleans up so that the folder structure described below is present.
Note that it will require approximately 116GB of free space to complete this process, though the final dataset will only take about 61GB of disk space (the script removes the extra files after checking the download was successful).
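The concatenate-and-verify step can be sketched with `hashlib`; this is a minimal stand-in for what `download.py` does, not the actual script (the file is hashed in chunks so multi-GB archives never need to fit in RAM):

```python
import hashlib
import os
import tempfile

def md5_of(path, chunk_size=1 << 20):
    """Stream a file through MD5 in 1 MiB chunks and return the hex digest."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a tiny stand-in file (a real run would hash e.g. giraffes.zip
# and compare against the digest recorded in giraffes_md5.txt).
fd, tmp = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"hello")
digest = md5_of(tmp)
```

A mismatch between the computed digest and the recorded one indicates a corrupted or incomplete download, in which case the affected part files should be re-fetched.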
The KABR dataset follows the Charades format:
```
KABR
/dataset
/image
/video_1
/image_1.jpg
/image_2.jpg
...
/image_n.jpg
/video_2
/image_1.jpg
/image_2.jpg
...
/image_n.jpg
...
/video_n
/image_1.jpg
/image_2.jpg
/image_3.jpg
...
/image_n.jpg
/annotation
/classes.json
/train.csv
/val.csv
```
The dataset can be directly loaded and processed by the [SlowFast](https://github.com/facebookresearch/SlowFast) framework.
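Once extracted, the layout above can be traversed with plain `os` calls; the following is a minimal sketch (the folder and file names here are made-up stand-ins, not real video IDs):

```python
import os
import tempfile

def count_frames(image_root):
    """Map each video folder under image_root to its number of JPEG frames."""
    counts = {}
    for video in sorted(os.listdir(image_root)):
        video_dir = os.path.join(image_root, video)
        if os.path.isdir(video_dir):
            counts[video] = sum(
                1 for f in os.listdir(video_dir) if f.endswith(".jpg")
            )
    return counts

# Build a miniature KABR-style tree: dataset/image/<video>/<frame>.jpg
root = tempfile.mkdtemp()
image_root = os.path.join(root, "dataset", "image")
for video, n_frames in [("G0001.1", 3), ("ZP0002.1", 2)]:
    video_dir = os.path.join(image_root, video)
    os.makedirs(video_dir)
    for i in range(1, n_frames + 1):
        open(os.path.join(video_dir, f"{i}.jpg"), "w").close()

frames = count_frames(image_root)
```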
**Informational Files**
* `KABR/configs`: examples of SlowFast framework configs.
* `KABR/annotation/distribution.xlsx`: distribution of classes for all videos.
**Scripts:**
* `image2video.py`: Encode image sequences into the original video.
* For example, `[image/G0067.1, image/G0067.2, ..., image/G0067.24]` will be encoded into `video/G0067.mp4`.
* `image2visual.py`: Encode image sequences into the original video with corresponding annotations.
* For example, `[image/G0067.1, image/G0067.2, ..., image/G0067.24]` will be encoded into `visual/G0067.mp4`.
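The stated 29.97 fps recording rate is enough to sketch what such an encoding step looks like. Below is an illustrative ffmpeg invocation built in Python; it is not the repository's actual `image2video.py`, and the `%d.jpg` frame pattern, codec, and flags are assumptions:

```python
import shlex

FPS = 29.97  # native recording frame rate of the drone footage

def image2video_cmd(frame_dir, out_path):
    """Build an ffmpeg command that re-encodes a frame sequence
    (1.jpg, 2.jpg, ...) into a single video file."""
    return (
        f"ffmpeg -framerate {FPS} -i {shlex.quote(frame_dir)}/%d.jpg "
        f"-c:v libx264 -pix_fmt yuv420p {shlex.quote(out_path)}"
    )

cmd = image2video_cmd("image/G0067.1", "video/G0067.mp4")
```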
### Data Instances
**Naming:** Within the image folder, the `video_n` folders are named as follows (X indicates a number):
* G0XXX.X - Giraffes
* ZP0XXX.X - Plains Zebras
* ZG0XXX.X - Grevy's Zebras
* Within each of these folders, the images are simply `X.jpg`.
**Note:** The dataset consists of a total of 1,139,893 frames captured from drone videos. There are 488,638 frames of Grevy's zebras, 492,507 frames of plains zebras, and 158,748 frames of giraffes.
### Data Fields
There are 14,764 unique behavioral sequences in the dataset. These consist of eight distinct behaviors:
- Walk
- Trot
- Run: animal is moving at a canter or gallop
- Graze: animal is eating grass or other vegetation
- Browse: animal is eating trees or bushes
- Head Up: animal is looking around or observing its surroundings
- Auto-Groom: animal is grooming itself (licking, scratching, or rubbing)
- Occluded: animal is not fully visible
### Data Splits
Training and validation sets are indicated by their respective CSV files (`train.csv` and `val.csv`), located within the `annotation` folder.
## Dataset Creation
### Curation Rationale
We present a novel high-quality dataset for animal behavior recognition from drone videos.
The dataset is focused on Kenyan wildlife and contains behaviors of giraffes, plains zebras, and Grevy's zebras.
The dataset consists of more than 10 hours of annotated videos, and it includes eight different classes, encompassing seven types of animal behavior and an additional category for occluded instances.
In the annotation process for this dataset, a team of 10 people was involved, with an expert zoologist overseeing the process.
Each behavior was labeled based on its distinctive features, using a standardized set of criteria to ensure consistency and accuracy across the annotations.
The dataset was collected using drones that flew over the animals in the [Mpala Research Centre](https://mpala.org/) in Kenya, providing high-quality video footage of the animals' natural behaviors.
We believe that this dataset will be a valuable resource for researchers working on animal behavior recognition, as it provides a diverse and high-quality set of annotated videos that can be used for evaluating deep learning models.
Additionally, the dataset can be used to study the behavior patterns of Kenyan animals and can help to inform conservation efforts and wildlife management strategies.
<!-- [To be added:] -->
We provide a detailed description of the dataset and its annotation process, along with some initial experiments on the dataset using conventional deep learning models.
The results demonstrate the effectiveness of the dataset for animal behavior recognition and highlight the potential for further research in this area.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from 6 January 2023 through 21 January 2023 at the [Mpala Research Centre](https://mpala.org/) in Kenya under a Nacosti research license. We used DJI Mavic 2S drones equipped with cameras to record 5.4K resolution videos (5472 x 3078 pixels) from varying altitudes and distances of 10 to 50 meters from the animals (distance was determined by circumstances and safety regulations).
Mini-scenes were extracted from these videos to reduce the impact of drone movement and facilitate human annotation. Animals were detected in frame using YOLOv8, and the SORT tracking algorithm was then applied to follow their movement. A 400 by 300 pixel window, centered on the animal, was then extracted; this is the mini-scene.
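Given the stated 5472 x 3078 frame size and 400 x 300 window, the cropping geometry reduces to pure arithmetic. The sketch below is our illustration only; in particular, clamping the window to the frame edge is an assumption, not necessarily how the authors' pipeline handles animals near the border:

```python
FRAME_W, FRAME_H = 5472, 3078  # native drone video resolution
WIN_W, WIN_H = 400, 300        # mini-scene window size

def mini_scene_box(cx, cy):
    """Return (x0, y0, x1, y1) for a WIN_W x WIN_H window centered on
    (cx, cy) -- e.g. the midpoint of a tracked bounding box -- shifted
    as needed so it stays fully inside the frame."""
    x0 = int(round(cx - WIN_W / 2))
    y0 = int(round(cy - WIN_H / 2))
    x0 = min(max(x0, 0), FRAME_W - WIN_W)
    y0 = min(max(y0, 0), FRAME_H - WIN_H)
    return x0, y0, x0 + WIN_W, y0 + WIN_H

# Animal near the left edge: the window is shifted right to stay in frame.
box = mini_scene_box(cx=50, cy=1500)
```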
<!--
#### Who are the source language producers?
[More Information Needed]
-->
### Annotations
#### Annotation process
In the annotation process for this dataset, a team of 10 people was involved, with an expert zoologist overseeing the process.
Each behavior was labeled based on its distinctive features, using a standardized set of criteria to ensure consistency and accuracy across the annotations.
<!--
#### Who are the annotators?
[More Information Needed]
-->
### Personal and Sensitive Information
Though there are endangered species included in this data, exact locations are not provided and their safety is assured by their location within the preserve.
## Considerations for Using the Data
<!--
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
-->
### Other Known Limitations
This data exhibits a long-tailed distribution due to the natural variation in frequency of the observed behaviors.
## Additional Information
### Authors
* Maksim Kholiavchenko (Rensselaer Polytechnic Institute) - ORCID: 0000-0001-6757-1957
* Jenna Kline (The Ohio State University)
* Michelle Ramirez (The Ohio State University)
* Sam Stevens (The Ohio State University)
* Alec Sheets (The Ohio State University) - ORCID: 0000-0002-3737-1484
* Reshma Ramesh Babu (The Ohio State University) - ORCID: 0000-0002-2517-5347
* Namrata Banerji (The Ohio State University) - ORCID: 0000-0001-6813-0010
* Elizabeth Campolongo (Imageomics Institute, The Ohio State University) - ORCID: 0000-0003-0846-2413
* Matthew Thompson (Imageomics Institute, The Ohio State University) - ORCID: 0000-0003-0583-8585
* Nina Van Tiel (Eidgenössische Technische Hochschule Zürich) - ORCID: 0000-0001-6393-5629
* Jackson Miliko (Mpala Research Centre)
* Eduardo Bessa (Universidade de Brasília) - ORCID: 0000-0003-0606-5860
* Tanya Berger-Wolf (The Ohio State University) - ORCID: 0000-0001-7610-1412
* Daniel Rubenstein (Princeton University) - ORCID: 0000-0001-9049-5219
* Charles Stewart (Rensselaer Polytechnic Institute)
### Licensing Information
This dataset is dedicated to the public domain for the benefit of scientific pursuits. We ask that you cite the dataset and journal paper using the citations below if you make use of it in your research.
### Citation Information
#### Dataset
```
@misc{KABR_Data,
author = {Kholiavchenko, Maksim and Kline, Jenna and Ramirez, Michelle and Stevens, Sam and Sheets, Alec and Babu, Reshma and Banerji, Namrata and Campolongo, Elizabeth and Thompson, Matthew and Van Tiel, Nina and Miliko, Jackson and Bessa, Eduardo and Duporge, Isla and Berger-Wolf, Tanya and Rubenstein, Daniel and Stewart, Charles},
title = {KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition from Drone Videos},
year = {2023},
url = {https://huggingface.co/datasets/imageomics/KABR},
doi = {10.57967/hf/1010},
publisher = {Hugging Face}
}
```
#### Paper
```
@inproceedings{kholiavchenko2024kabr,
title={KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition from Drone Videos},
author={Kholiavchenko, Maksim and Kline, Jenna and Ramirez, Michelle and Stevens, Sam and Sheets, Alec and Babu, Reshma and Banerji, Namrata and Campolongo, Elizabeth and Thompson, Matthew and Van Tiel, Nina and Miliko, Jackson and Bessa, Eduardo and Duporge, Isla and Berger-Wolf, Tanya and Rubenstein, Daniel and Stewart, Charles},
booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
pages={31-40},
year={2024}
}
```
### Contributions
The [Imageomics Institute](https://imageomics.org) is funded by the US National Science Foundation's Harnessing the Data Revolution (HDR) program under [Award #2118240](https://www.nsf.gov/awardsearch/showAward?AWD_ID=2118240) (Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
|
reciprocate/tinygsm_mixtral_8M | ---
dataset_info:
features:
- name: question
dtype: string
- name: program
dtype: string
- name: result
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 10716696014
num_examples: 8000000
download_size: 3197472673
dataset_size: 10716696014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rainbow/Andy_Lau | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6985835.0
num_examples: 16
download_size: 6986820
dataset_size: 6985835.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Andy_Lau"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-moral_scenarios-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 4507
num_examples: 5
download_size: 9481
dataset_size: 4507
---
# Dataset Card for "mmlu-moral_scenarios-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Soma8622/kokkai_speech | ---
license: mit
---
# Overview
[Data source](https://kokkai.ndl.go.jp/api.html) |
Lfu001/image-text | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: negative_prompt
dtype: string
splits:
- name: train
num_bytes: 387510452.0
num_examples: 210
download_size: 387472246
dataset_size: 387510452.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
scene_parse_150 | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- found
language:
- en
license:
- bsd-3-clause
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|ade20k
task_categories:
- image-segmentation
task_ids:
- instance-segmentation
paperswithcode_id: ade20k
pretty_name: MIT Scene Parsing Benchmark
tags:
- scene-parsing
dataset_info:
- config_name: scene_parsing
features:
- name: image
dtype: image
- name: annotation
dtype: image
- name: scene_category
dtype:
class_label:
names:
'0': airport_terminal
'1': art_gallery
'2': badlands
'3': ball_pit
'4': bathroom
'5': beach
'6': bedroom
'7': booth_indoor
'8': botanical_garden
'9': bridge
'10': bullring
'11': bus_interior
'12': butte
'13': canyon
'14': casino_outdoor
'15': castle
'16': church_outdoor
'17': closet
'18': coast
'19': conference_room
'20': construction_site
'21': corral
'22': corridor
'23': crosswalk
'24': day_care_center
'25': sand
'26': elevator_interior
'27': escalator_indoor
'28': forest_road
'29': gangplank
'30': gas_station
'31': golf_course
'32': gymnasium_indoor
'33': harbor
'34': hayfield
'35': heath
'36': hoodoo
'37': house
'38': hunting_lodge_outdoor
'39': ice_shelf
'40': joss_house
'41': kiosk_indoor
'42': kitchen
'43': landfill
'44': library_indoor
'45': lido_deck_outdoor
'46': living_room
'47': locker_room
'48': market_outdoor
'49': mountain_snowy
'50': office
'51': orchard
'52': arbor
'53': bookshelf
'54': mews
'55': nook
'56': preserve
'57': traffic_island
'58': palace
'59': palace_hall
'60': pantry
'61': patio
'62': phone_booth
'63': establishment
'64': poolroom_home
'65': quonset_hut_outdoor
'66': rice_paddy
'67': sandbox
'68': shopfront
'69': skyscraper
'70': stone_circle
'71': subway_interior
'72': platform
'73': supermarket
'74': swimming_pool_outdoor
'75': television_studio
'76': indoor_procenium
'77': train_railway
'78': coral_reef
'79': viaduct
'80': wave
'81': wind_farm
'82': bottle_storage
'83': abbey
'84': access_road
'85': air_base
'86': airfield
'87': airlock
'88': airplane_cabin
'89': airport
'90': entrance
'91': airport_ticket_counter
'92': alcove
'93': alley
'94': amphitheater
'95': amusement_arcade
'96': amusement_park
'97': anechoic_chamber
'98': apartment_building_outdoor
'99': apse_indoor
'100': apse_outdoor
'101': aquarium
'102': aquatic_theater
'103': aqueduct
'104': arcade
'105': arch
'106': archaelogical_excavation
'107': archive
'108': basketball
'109': football
'110': hockey
'111': performance
'112': rodeo
'113': soccer
'114': armory
'115': army_base
'116': arrival_gate_indoor
'117': arrival_gate_outdoor
'118': art_school
'119': art_studio
'120': artists_loft
'121': assembly_line
'122': athletic_field_indoor
'123': athletic_field_outdoor
'124': atrium_home
'125': atrium_public
'126': attic
'127': auditorium
'128': auto_factory
'129': auto_mechanics_indoor
'130': auto_mechanics_outdoor
'131': auto_racing_paddock
'132': auto_showroom
'133': backstage
'134': backstairs
'135': badminton_court_indoor
'136': badminton_court_outdoor
'137': baggage_claim
'138': shop
'139': exterior
'140': balcony_interior
'141': ballroom
'142': bamboo_forest
'143': bank_indoor
'144': bank_outdoor
'145': bank_vault
'146': banquet_hall
'147': baptistry_indoor
'148': baptistry_outdoor
'149': bar
'150': barbershop
'151': barn
'152': barndoor
'153': barnyard
'154': barrack
'155': baseball_field
'156': basement
'157': basilica
'158': basketball_court_indoor
'159': basketball_court_outdoor
'160': bathhouse
'161': batters_box
'162': batting_cage_indoor
'163': batting_cage_outdoor
'164': battlement
'165': bayou
'166': bazaar_indoor
'167': bazaar_outdoor
'168': beach_house
'169': beauty_salon
'170': bedchamber
'171': beer_garden
'172': beer_hall
'173': belfry
'174': bell_foundry
'175': berth
'176': berth_deck
'177': betting_shop
'178': bicycle_racks
'179': bindery
'180': biology_laboratory
'181': bistro_indoor
'182': bistro_outdoor
'183': bleachers_indoor
'184': bleachers_outdoor
'185': boardwalk
'186': boat_deck
'187': boathouse
'188': bog
'189': bomb_shelter_indoor
'190': bookbindery
'191': bookstore
'192': bow_window_indoor
'193': bow_window_outdoor
'194': bowling_alley
'195': box_seat
'196': boxing_ring
'197': breakroom
'198': brewery_indoor
'199': brewery_outdoor
'200': brickyard_indoor
'201': brickyard_outdoor
'202': building_complex
'203': building_facade
'204': bullpen
'205': burial_chamber
'206': bus_depot_indoor
'207': bus_depot_outdoor
'208': bus_shelter
'209': bus_station_indoor
'210': bus_station_outdoor
'211': butchers_shop
'212': cabana
'213': cabin_indoor
'214': cabin_outdoor
'215': cafeteria
'216': call_center
'217': campsite
'218': campus
'219': natural
'220': urban
'221': candy_store
'222': canteen
'223': car_dealership
'224': backseat
'225': frontseat
'226': caravansary
'227': cardroom
'228': cargo_container_interior
'229': airplane
'230': boat
'231': freestanding
'232': carport_indoor
'233': carport_outdoor
'234': carrousel
'235': casino_indoor
'236': catacomb
'237': cathedral_indoor
'238': cathedral_outdoor
'239': catwalk
'240': cavern_indoor
'241': cavern_outdoor
'242': cemetery
'243': chalet
'244': chaparral
'245': chapel
'246': checkout_counter
'247': cheese_factory
'248': chemical_plant
'249': chemistry_lab
'250': chicken_coop_indoor
'251': chicken_coop_outdoor
'252': chicken_farm_indoor
'253': chicken_farm_outdoor
'254': childs_room
'255': choir_loft_interior
'256': church_indoor
'257': circus_tent_indoor
'258': circus_tent_outdoor
'259': city
'260': classroom
'261': clean_room
'262': cliff
'263': booth
'264': room
'265': clock_tower_indoor
'266': cloister_indoor
'267': cloister_outdoor
'268': clothing_store
'269': coast_road
'270': cockpit
'271': coffee_shop
'272': computer_room
'273': conference_center
'274': conference_hall
'275': confessional
'276': control_room
'277': control_tower_indoor
'278': control_tower_outdoor
'279': convenience_store_indoor
'280': convenience_store_outdoor
'281': corn_field
'282': cottage
'283': cottage_garden
'284': courthouse
'285': courtroom
'286': courtyard
'287': covered_bridge_interior
'288': crawl_space
'289': creek
'290': crevasse
'291': library
'292': cybercafe
'293': dacha
'294': dairy_indoor
'295': dairy_outdoor
'296': dam
'297': dance_school
'298': darkroom
'299': delicatessen
'300': dentists_office
'301': department_store
'302': departure_lounge
'303': vegetation
'304': desert_road
'305': diner_indoor
'306': diner_outdoor
'307': dinette_home
'308': vehicle
'309': dining_car
'310': dining_hall
'311': dining_room
'312': dirt_track
'313': discotheque
'314': distillery
'315': ditch
'316': dock
'317': dolmen
'318': donjon
'319': doorway_indoor
'320': doorway_outdoor
'321': dorm_room
'322': downtown
'323': drainage_ditch
'324': dress_shop
'325': dressing_room
'326': drill_rig
'327': driveway
'328': driving_range_indoor
'329': driving_range_outdoor
'330': drugstore
'331': dry_dock
'332': dugout
'333': earth_fissure
'334': editing_room
'335': electrical_substation
'336': elevated_catwalk
'337': door
'338': freight_elevator
'339': elevator_lobby
'340': elevator_shaft
'341': embankment
'342': embassy
'343': engine_room
'344': entrance_hall
'345': escalator_outdoor
'346': escarpment
'347': estuary
'348': excavation
'349': exhibition_hall
'350': fabric_store
'351': factory_indoor
'352': factory_outdoor
'353': fairway
'354': farm
'355': fastfood_restaurant
'356': fence
'357': cargo_deck
'358': ferryboat_indoor
'359': passenger_deck
'360': cultivated
'361': wild
'362': field_road
'363': fire_escape
'364': fire_station
'365': firing_range_indoor
'366': firing_range_outdoor
'367': fish_farm
'368': fishmarket
'369': fishpond
'370': fitting_room_interior
'371': fjord
'372': flea_market_indoor
'373': flea_market_outdoor
'374': floating_dry_dock
'375': flood
'376': florist_shop_indoor
'377': florist_shop_outdoor
'378': fly_bridge
'379': food_court
'380': football_field
'381': broadleaf
'382': needleleaf
'383': forest_fire
'384': forest_path
'385': formal_garden
'386': fort
'387': fortress
'388': foundry_indoor
'389': foundry_outdoor
'390': fountain
'391': freeway
'392': funeral_chapel
'393': funeral_home
'394': furnace_room
'395': galley
'396': game_room
'397': garage_indoor
'398': garage_outdoor
'399': garbage_dump
'400': gasworks
'401': gate
'402': gatehouse
'403': gazebo_interior
'404': general_store_indoor
'405': general_store_outdoor
'406': geodesic_dome_indoor
'407': geodesic_dome_outdoor
'408': ghost_town
'409': gift_shop
'410': glacier
'411': glade
'412': gorge
'413': granary
'414': great_hall
'415': greengrocery
'416': greenhouse_indoor
'417': greenhouse_outdoor
'418': grotto
'419': guardhouse
'420': gulch
'421': gun_deck_indoor
'422': gun_deck_outdoor
'423': gun_store
'424': hacienda
'425': hallway
'426': handball_court
'427': hangar_indoor
'428': hangar_outdoor
'429': hardware_store
'430': hat_shop
'431': hatchery
'432': hayloft
'433': hearth
'434': hedge_maze
'435': hedgerow
'436': heliport
'437': herb_garden
'438': highway
'439': hill
'440': home_office
'441': home_theater
'442': hospital
'443': hospital_room
'444': hot_spring
'445': hot_tub_indoor
'446': hot_tub_outdoor
'447': hotel_outdoor
'448': hotel_breakfast_area
'449': hotel_room
'450': hunting_lodge_indoor
'451': hut
'452': ice_cream_parlor
'453': ice_floe
'454': ice_skating_rink_indoor
'455': ice_skating_rink_outdoor
'456': iceberg
'457': igloo
'458': imaret
'459': incinerator_indoor
'460': incinerator_outdoor
'461': industrial_area
'462': industrial_park
'463': inn_indoor
'464': inn_outdoor
'465': irrigation_ditch
'466': islet
'467': jacuzzi_indoor
'468': jacuzzi_outdoor
'469': jail_indoor
'470': jail_outdoor
'471': jail_cell
'472': japanese_garden
'473': jetty
'474': jewelry_shop
'475': junk_pile
'476': junkyard
'477': jury_box
'478': kasbah
'479': kennel_indoor
'480': kennel_outdoor
'481': kindergarden_classroom
'482': kiosk_outdoor
'483': kitchenette
'484': lab_classroom
'485': labyrinth_indoor
'486': labyrinth_outdoor
'487': lagoon
'488': artificial
'489': landing
'490': landing_deck
'491': laundromat
'492': lava_flow
'493': lavatory
'494': lawn
'495': lean-to
'496': lecture_room
'497': legislative_chamber
'498': levee
'499': library_outdoor
'500': lido_deck_indoor
'501': lift_bridge
'502': lighthouse
'503': limousine_interior
'504': liquor_store_indoor
'505': liquor_store_outdoor
'506': loading_dock
'507': lobby
'508': lock_chamber
'509': loft
'510': lookout_station_indoor
'511': lookout_station_outdoor
'512': lumberyard_indoor
'513': lumberyard_outdoor
'514': machine_shop
'515': manhole
'516': mansion
'517': manufactured_home
'518': market_indoor
'519': marsh
'520': martial_arts_gym
'521': mastaba
'522': maternity_ward
'523': mausoleum
'524': medina
'525': menhir
'526': mesa
'527': mess_hall
'528': mezzanine
'529': military_hospital
'530': military_hut
'531': military_tent
'532': mine
'533': mineshaft
'534': mini_golf_course_indoor
'535': mini_golf_course_outdoor
'536': mission
'537': dry
'538': water
'539': mobile_home
'540': monastery_indoor
'541': monastery_outdoor
'542': moon_bounce
'543': moor
'544': morgue
'545': mosque_indoor
'546': mosque_outdoor
'547': motel
'548': mountain
'549': mountain_path
'550': mountain_road
'551': movie_theater_indoor
'552': movie_theater_outdoor
'553': mudflat
'554': museum_indoor
'555': museum_outdoor
'556': music_store
'557': music_studio
'558': misc
'559': natural_history_museum
'560': naval_base
'561': newsroom
'562': newsstand_indoor
'563': newsstand_outdoor
'564': nightclub
'565': nuclear_power_plant_indoor
'566': nuclear_power_plant_outdoor
'567': nunnery
'568': nursery
'569': nursing_home
'570': oasis
'571': oast_house
'572': observatory_indoor
'573': observatory_outdoor
'574': observatory_post
'575': ocean
'576': office_building
'577': office_cubicles
'578': oil_refinery_indoor
'579': oil_refinery_outdoor
'580': oilrig
'581': operating_room
'582': optician
'583': organ_loft_interior
'584': orlop_deck
'585': ossuary
'586': outcropping
'587': outhouse_indoor
'588': outhouse_outdoor
'589': overpass
'590': oyster_bar
'591': oyster_farm
'592': acropolis
'593': aircraft_carrier_object
'594': amphitheater_indoor
'595': archipelago
'596': questionable
'597': assembly_hall
'598': assembly_plant
'599': awning_deck
'600': back_porch
'601': backdrop
'602': backroom
'603': backstage_outdoor
'604': backstairs_indoor
'605': backwoods
'606': ballet
'607': balustrade
'608': barbeque
'609': basin_outdoor
'610': bath_indoor
'611': bath_outdoor
'612': bathhouse_outdoor
'613': battlefield
'614': bay
'615': booth_outdoor
'616': bottomland
'617': breakfast_table
'618': bric-a-brac
'619': brooklet
'620': bubble_chamber
'621': buffet
'622': bulkhead
'623': bunk_bed
'624': bypass
'625': byroad
'626': cabin_cruiser
'627': cargo_helicopter
'628': cellar
'629': chair_lift
'630': cocktail_lounge
'631': corner
'632': country_house
'633': country_road
'634': customhouse
'635': dance_floor
'636': deck-house_boat_deck_house
'637': deck-house_deck_house
'638': dining_area
'639': diving_board
'640': embrasure
'641': entranceway_indoor
'642': entranceway_outdoor
'643': entryway_outdoor
'644': estaminet
'645': farm_building
'646': farmhouse
'647': feed_bunk
'648': field_house
'649': field_tent_indoor
'650': field_tent_outdoor
'651': fire_trench
'652': fireplace
'653': flashflood
'654': flatlet
'655': floating_dock
'656': flood_plain
'657': flowerbed
'658': flume_indoor
'659': flying_buttress
'660': foothill
'661': forecourt
'662': foreshore
'663': front_porch
'664': garden
'665': gas_well
'666': glen
'667': grape_arbor
'668': grove
'669': guardroom
'670': guesthouse
'671': gymnasium_outdoor
'672': head_shop
'673': hen_yard
'674': hillock
'675': housing_estate
'676': housing_project
'677': howdah
'678': inlet
'679': insane_asylum
'680': outside
'681': juke_joint
'682': jungle
'683': kraal
'684': laboratorywet
'685': landing_strip
'686': layby
'687': lean-to_tent
'688': loge
'689': loggia_outdoor
'690': lower_deck
'691': luggage_van
'692': mansard
'693': meadow
'694': meat_house
'695': megalith
'696': mens_store_outdoor
'697': mental_institution_indoor
'698': mental_institution_outdoor
'699': military_headquarters
'700': millpond
'701': millrace
'702': natural_spring
'703': nursing_home_outdoor
'704': observation_station
'705': open-hearth_furnace
'706': operating_table
'707': outbuilding
'708': palestra
'709': parkway
'710': patio_indoor
'711': pavement
'712': pawnshop_outdoor
'713': pinetum
'714': piste_road
'715': pizzeria_outdoor
'716': powder_room
'717': pumping_station
'718': reception_room
'719': rest_stop
'720': retaining_wall
'721': rift_valley
'722': road
'723': rock_garden
'724': rotisserie
'725': safari_park
'726': salon
'727': saloon
'728': sanatorium
'729': science_laboratory
'730': scrubland
'731': scullery
'732': seaside
'733': semidesert
'734': shelter
'735': shelter_deck
'736': shelter_tent
'737': shore
'738': shrubbery
'739': sidewalk
'740': snack_bar
'741': snowbank
'742': stage_set
'743': stall
'744': stateroom
'745': store
'746': streetcar_track
'747': student_center
'748': study_hall
'749': sugar_refinery
'750': sunroom
'751': supply_chamber
'752': t-bar_lift
'753': tannery
'754': teahouse
'755': threshing_floor
'756': ticket_window_indoor
'757': tidal_basin
'758': tidal_river
'759': tiltyard
'760': tollgate
'761': tomb
'762': tract_housing
'763': trellis
'764': truck_stop
'765': upper_balcony
'766': vestibule
'767': vinery
'768': walkway
'769': war_room
'770': washroom
'771': water_fountain
'772': water_gate
'773': waterscape
'774': waterway
'775': wetland
'776': widows_walk_indoor
'777': windstorm
'778': packaging_plant
'779': pagoda
'780': paper_mill
'781': park
'782': parking_garage_indoor
'783': parking_garage_outdoor
'784': parking_lot
'785': parlor
'786': particle_accelerator
'787': party_tent_indoor
'788': party_tent_outdoor
'789': pasture
'790': pavilion
'791': pawnshop
'792': pedestrian_overpass_indoor
'793': penalty_box
'794': pet_shop
'795': pharmacy
'796': physics_laboratory
'797': piano_store
'798': picnic_area
'799': pier
'800': pig_farm
'801': pilothouse_indoor
'802': pilothouse_outdoor
'803': pitchers_mound
'804': pizzeria
'805': planetarium_indoor
'806': planetarium_outdoor
'807': plantation_house
'808': playground
'809': playroom
'810': plaza
'811': podium_indoor
'812': podium_outdoor
'813': police_station
'814': pond
'815': pontoon_bridge
'816': poop_deck
'817': porch
'818': portico
'819': portrait_studio
'820': postern
'821': power_plant_outdoor
'822': print_shop
'823': priory
'824': promenade
'825': promenade_deck
'826': pub_indoor
'827': pub_outdoor
'828': pulpit
'829': putting_green
'830': quadrangle
'831': quicksand
'832': quonset_hut_indoor
'833': racecourse
'834': raceway
'835': raft
'836': railroad_track
'837': railway_yard
'838': rainforest
'839': ramp
'840': ranch
'841': ranch_house
'842': reading_room
'843': reception
'844': recreation_room
'845': rectory
'846': recycling_plant_indoor
'847': refectory
'848': repair_shop
'849': residential_neighborhood
'850': resort
'851': rest_area
'852': restaurant
'853': restaurant_kitchen
'854': restaurant_patio
'855': restroom_indoor
'856': restroom_outdoor
'857': revolving_door
'858': riding_arena
'859': river
'860': road_cut
'861': rock_arch
'862': roller_skating_rink_indoor
'863': roller_skating_rink_outdoor
'864': rolling_mill
'865': roof
'866': roof_garden
'867': root_cellar
'868': rope_bridge
'869': roundabout
'870': roundhouse
'871': rubble
'872': ruin
'873': runway
'874': sacristy
'875': salt_plain
'876': sand_trap
'877': sandbar
'878': sauna
'879': savanna
'880': sawmill
'881': schoolhouse
'882': schoolyard
'883': science_museum
'884': scriptorium
'885': sea_cliff
'886': seawall
'887': security_check_point
'888': server_room
'889': sewer
'890': sewing_room
'891': shed
'892': shipping_room
'893': shipyard_outdoor
'894': shoe_shop
'895': shopping_mall_indoor
'896': shopping_mall_outdoor
'897': shower
'898': shower_room
'899': shrine
'900': signal_box
'901': sinkhole
'902': ski_jump
'903': ski_lodge
'904': ski_resort
'905': ski_slope
'906': sky
'907': skywalk_indoor
'908': skywalk_outdoor
'909': slum
'910': snowfield
'911': massage_room
'912': mineral_bath
'913': spillway
'914': sporting_goods_store
'915': squash_court
'916': stable
'917': baseball
'918': stadium_outdoor
'919': stage_indoor
'920': stage_outdoor
'921': staircase
'922': starting_gate
'923': steam_plant_outdoor
'924': steel_mill_indoor
'925': storage_room
'926': storm_cellar
'927': street
'928': strip_mall
'929': strip_mine
'930': student_residence
'931': submarine_interior
'932': sun_deck
'933': sushi_bar
'934': swamp
'935': swimming_hole
'936': swimming_pool_indoor
'937': synagogue_indoor
'938': synagogue_outdoor
'939': taxistand
'940': taxiway
'941': tea_garden
'942': tearoom
'943': teashop
'944': television_room
'945': east_asia
'946': mesoamerican
'947': south_asia
'948': western
'949': tennis_court_indoor
'950': tennis_court_outdoor
'951': tent_outdoor
'952': terrace_farm
'953': indoor_round
'954': indoor_seats
'955': theater_outdoor
'956': thriftshop
'957': throne_room
'958': ticket_booth
'959': tobacco_shop_indoor
'960': toll_plaza
'961': tollbooth
'962': topiary_garden
'963': tower
'964': town_house
'965': toyshop
'966': track_outdoor
'967': trading_floor
'968': trailer_park
'969': train_interior
'970': train_station_outdoor
'971': station
'972': tree_farm
'973': tree_house
'974': trench
'975': trestle_bridge
'976': tundra
'977': rail_indoor
'978': rail_outdoor
'979': road_indoor
'980': road_outdoor
'981': turkish_bath
'982': ocean_deep
'983': ocean_shallow
'984': utility_room
'985': valley
'986': van_interior
'987': vegetable_garden
'988': velodrome_indoor
'989': velodrome_outdoor
'990': ventilation_shaft
'991': veranda
'992': vestry
'993': veterinarians_office
'994': videostore
'995': village
'996': vineyard
'997': volcano
'998': volleyball_court_indoor
'999': volleyball_court_outdoor
'1000': voting_booth
'1001': waiting_room
'1002': walk_in_freezer
'1003': warehouse_indoor
'1004': warehouse_outdoor
'1005': washhouse_indoor
'1006': washhouse_outdoor
'1007': watchtower
'1008': water_mill
'1009': water_park
'1010': water_tower
'1011': water_treatment_plant_indoor
'1012': water_treatment_plant_outdoor
'1013': block
'1014': cascade
'1015': cataract
'1016': fan
'1017': plunge
'1018': watering_hole
'1019': weighbridge
'1020': wet_bar
'1021': wharf
'1022': wheat_field
'1023': whispering_gallery
'1024': widows_walk_interior
'1025': windmill
'1026': window_seat
'1027': barrel_storage
'1028': winery
'1029': witness_stand
'1030': woodland
'1031': workroom
'1032': workshop
'1033': wrestling_ring_indoor
'1034': wrestling_ring_outdoor
'1035': yard
'1036': youth_hostel
'1037': zen_garden
'1038': ziggurat
'1039': zoo
'1040': forklift
'1041': hollow
'1042': hutment
'1043': pueblo
'1044': vat
'1045': perfume_shop
'1046': steel_mill_outdoor
'1047': orchestra_pit
'1048': bridle_path
'1049': lyceum
'1050': one-way_street
'1051': parade_ground
'1052': pump_room
'1053': recycling_plant_outdoor
'1054': chuck_wagon
splits:
- name: train
num_bytes: 8468086
num_examples: 20210
- name: test
num_bytes: 744607
num_examples: 3352
- name: validation
num_bytes: 838032
num_examples: 2000
download_size: 1179202534
dataset_size: 10050725
- config_name: instance_segmentation
features:
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 862611544
num_examples: 20210
- name: test
num_bytes: 212493928
num_examples: 3352
- name: validation
num_bytes: 87502294
num_examples: 2000
download_size: 1197393920
dataset_size: 1162607766
---
# Dataset Card for MIT Scene Parsing Benchmark
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [MIT Scene Parsing Benchmark homepage](http://sceneparsing.csail.mit.edu/)
- **Repository:** [Scene Parsing repository (Caffe/Torch7)](https://github.com/CSAILVision/sceneparsing), [Scene Parsing repository (PyTorch)](https://github.com/CSAILVision/semantic-segmentation-pytorch) and [Instance Segmentation repository](https://github.com/CSAILVision/placeschallenge/tree/master/instancesegmentation)
- **Paper:** [Scene Parsing through ADE20K Dataset](http://people.csail.mit.edu/bzhou/publication/scene-parse-camera-ready.pdf) and [Semantic Understanding of Scenes through ADE20K Dataset](https://arxiv.org/abs/1608.05442)
- **Leaderboard:** [MIT Scene Parsing Benchmark leaderboard](http://sceneparsing.csail.mit.edu/#:~:text=twice%20per%20week.-,leaderboard,-Organizers)
- **Point of Contact:** [Bolei Zhou](mailto:bzhou@ie.cuhk.edu.hk)
### Dataset Summary
Scene parsing is the task of segmenting and parsing an image into different image regions associated with semantic categories, such as sky, road, person, and bed. The MIT Scene Parsing Benchmark (SceneParse150) provides a standard training and evaluation platform for scene parsing algorithms. The data for this benchmark comes from the ADE20K Dataset, which contains more than 20K scene-centric images exhaustively annotated with objects and object parts. Specifically, the benchmark is divided into 20K images for training, 2K images for validation, and another batch of held-out images for testing. In total, 150 semantic categories are included for evaluation, covering stuff classes such as sky, road, and grass, as well as discrete objects such as person, car, and bed. Note that the distribution of objects occurring in the images is non-uniform, mimicking natural object occurrence in everyday scenes.
The goal of this benchmark is to segment and parse an image into different image regions associated with semantic categories, such as sky, road, person, and bed. The benchmark is similar to the semantic segmentation tasks in the COCO and Pascal datasets, but its data is more scene-centric and covers a more diverse range of object categories.
### Supported Tasks and Leaderboards
- `scene-parsing`: The goal of this task is to segment the whole image densely into semantic classes (image regions), where each pixel is assigned a class label such as the region of *tree* and the region of *building*.
[The leaderboard](http://sceneparsing.csail.mit.edu/#:~:text=twice%20per%20week.-,leaderboard,-Organizers) for this task ranks the models by the mean of the pixel-wise accuracy and the class-wise IoU as the final score. Pixel-wise accuracy is the ratio of pixels that are correctly predicted, while class-wise IoU is the Intersection over Union of pixels averaged over all 150 semantic categories. Refer to the [Development Kit](https://github.com/CSAILVision/sceneparsing) for details.
- `instance-segmentation`: The goal of this task is to detect the object instances in an image and generate a precise segmentation mask for each of them. Unlike scene parsing, where the segmented regions carry no notion of instances, instance segmentation requires each instance to be segmented separately: if there are three persons in the scene, the network must segment each of the three person regions individually. This task doesn't have an active leaderboard. The performance of instance segmentation algorithms is evaluated by Average Precision (AP, or mAP), following the COCO evaluation metrics. For each image, at most 255 top-scoring instance masks are taken across all categories. Each instance mask prediction is only considered if its IoU with the ground truth is above a certain threshold. There are 10 IoU thresholds, 0.50:0.05:0.95, used for evaluation. The final AP is averaged across the 10 IoU thresholds and 100 categories. Refer to the [COCO evaluation page](http://mscoco.org/dataset/#detections-eval) for more explanation.
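The two evaluation protocols above can be sketched in a few lines of NumPy. This is a minimal illustration, not the official scoring code: it assumes integer label maps where 0 marks ignored pixels, and, for the instance task, that each predicted mask has already been matched to its best ground-truth IoU (real COCO AP additionally ranks predictions by confidence score, which is omitted here).

```python
import numpy as np

def scene_parsing_score(pred, gt, num_classes=150):
    """Mean of pixel accuracy and class-wise mean IoU for one image.

    pred, gt: (H, W) integer label maps; gt pixels labeled 0 are ignored,
    matching the official evaluation."""
    valid = gt > 0
    pixel_acc = ((pred == gt) & valid).sum() / max(valid.sum(), 1)
    ious = []
    for c in range(1, num_classes + 1):
        pred_c = (pred == c) & valid
        gt_c = gt == c
        union = (pred_c | gt_c).sum()
        if union:  # skip classes absent from both masks
            ious.append((pred_c & gt_c).sum() / union)
    miou = float(np.mean(ious)) if ious else 0.0
    return (float(pixel_acc) + miou) / 2

def ap_over_iou_thresholds(matched_ious):
    """Toy stand-in for COCO-style AP: the fraction of predicted instances
    whose best-match IoU clears each of the 10 thresholds 0.50:0.05:0.95,
    averaged over the thresholds."""
    ious = np.asarray(matched_ious, dtype=float)
    thresholds = np.arange(0.50, 1.00, 0.05)  # 0.50, 0.55, ..., 0.95
    return float(np.mean([(ious >= t).mean() for t in thresholds]))
```

Both helpers operate per image; the benchmark aggregates over the whole split.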
### Languages
English.
## Dataset Structure
### Data Instances
A data point comprises an image and its annotation mask, which is `None` in the testing set. The `scene_parsing` configuration has an additional `scene_category` field.
#### `scene_parsing`
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=683x512 at 0x1FF32A3EDA0>,
'annotation': <PIL.PngImagePlugin.PngImageFile image mode=L size=683x512 at 0x1FF32E5B978>,
'scene_category': 0
}
```
#### `instance_segmentation`
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=256x256 at 0x20B51B5C400>,
'annotation': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=256x256 at 0x20B57051B38>
}
```
### Data Fields
#### `scene_parsing`
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `annotation`: A `PIL.Image.Image` object containing the annotation mask.
- `scene_category`: A scene category for the image (e.g. `airport_terminal`, `canyon`, `mobile_home`).
> **Note**: annotation masks contain labels ranging from 0 to 150, where 0 refers to "other objects". Those pixels are not considered in the official evaluation. Refer to [this file](https://github.com/CSAILVision/sceneparsing/blob/master/objectInfo150.csv) for the information about the labels of the 150 semantic categories, including indices, pixel ratios and names.
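As a concrete illustration of the note above, the hypothetical snippet below builds a tiny stand-in annotation array and drops the 0-labeled pixels before inspecting the labels; in practice `ann` would be `np.array(sample["annotation"])` for one record of the `scene_parsing` configuration.

```python
import numpy as np

# Stand-in for np.array(sample["annotation"]): a 4x4 mask of zeros
# ("other objects") with a small region labeled as some class, here 12.
ann = np.zeros((4, 4), dtype=np.uint8)
ann[1:3, 1:3] = 12

evaluated = ann[ann > 0]               # only these pixels count toward the metrics
labels_present = np.unique(evaluated)  # semantic categories present in the mask
```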
#### `instance_segmentation`
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `annotation`: A `PIL.Image.Image` object containing the annotation mask.
> **Note**: in the instance annotation masks, the R(ed) channel encodes category ID, and the G(reen) channel encodes instance ID. Each object instance has a unique instance ID regardless of its category ID. In the dataset, all images have <256 object instances. Refer to [this file (train split)](https://github.com/CSAILVision/placeschallenge/blob/master/instancesegmentation/instanceInfo100_train.txt) and to [this file (validation split)](https://github.com/CSAILVision/placeschallenge/blob/master/instancesegmentation/instanceInfo100_val.txt) for the information about the labels of the 100 semantic categories. To find the mapping between the semantic categories for `instance_segmentation` and `scene_parsing`, refer to [this file](https://github.com/CSAILVision/placeschallenge/blob/master/instancesegmentation/categoryMapping.txt).
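The channel layout described in the note can be decoded with plain NumPy. The mask below is synthetic (two instances of one category), purely to show the split into category and instance IDs; a real mask would come from `np.array(sample["annotation"])`.

```python
import numpy as np

# Synthetic 2x2 instance mask: R channel = category ID, G channel = instance ID.
mask = np.zeros((2, 2, 3), dtype=np.uint8)
mask[0, 0] = (5, 1, 0)   # category 5, instance 1
mask[0, 1] = (5, 2, 0)   # category 5, instance 2

category_ids = mask[..., 0]   # R channel
instance_ids = mask[..., 1]   # G channel

# Instance ID 0 is background; each object has a unique ID regardless of category.
num_instances = np.unique(instance_ids[instance_ids > 0]).size
```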
### Data Splits
The data is split into training, test, and validation sets. The training split contains 20,210 images, the test split contains 3,352 images, and the validation split contains 2,000 images.
## Dataset Creation
### Curation Rationale
The rationale from the paper for the ADE20K dataset from which this benchmark originates:
> Semantic understanding of visual scenes is one of the holy grails of computer vision. Despite efforts of the community in data collection, there are still few image datasets covering a wide range of scenes and object categories with pixel-wise annotations for scene understanding. In this work, we present a densely annotated dataset ADE20K, which spans diverse annotations of scenes, objects, parts of objects, and in some cases even parts of parts.
> The motivation of this work is to collect a dataset that has densely annotated images (every pixel has a semantic label) with a large and an unrestricted open vocabulary. The images in our dataset are manually segmented in great detail, covering a diverse set of scenes, object and object part categories. The challenge for collecting such annotations is finding reliable annotators, as well as the fact that labeling is difficult if the class list is not defined in advance. On the other hand, open vocabulary naming also suffers from naming inconsistencies across different annotators. In contrast, our dataset was annotated by a single expert annotator, providing extremely detailed and exhaustive image annotations. On average, our annotator labeled 29 annotation segments per image, compared to the 16 segments per image labeled by external annotators (like workers from Amazon Mechanical Turk). Furthermore, the data consistency and quality are much higher than that of external annotators.
### Source Data
#### Initial Data Collection and Normalization
Images come from the LabelMe, SUN, and Places datasets and were selected to cover the 900 scene categories defined in the SUN database.
This benchmark was built by selecting the top 150 objects ranked by their total pixel ratios in the ADE20K dataset. As the original images in the ADE20K dataset have various sizes, for simplicity the large images were rescaled so that their minimum height or width is 512. Among the 150 objects, there are 35 stuff classes (e.g., wall, sky, road) and 115 discrete objects (e.g., car, person, table). The annotated pixels of the 150 objects occupy 92.75% of all the pixels in the dataset, where the stuff classes occupy 60.92% and the discrete objects occupy 31.83%.
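The rescaling rule above (shrink a large image so its shorter side becomes 512) can be sketched as a small size computation; the function name is ours, not part of the benchmark tooling, and the resulting size could then be passed to e.g. `PIL.Image.resize`.

```python
def rescale_min_side(size, target=512):
    """Return the (width, height) after downscaling so that the shorter side
    equals `target`; images already at or below that size pass through."""
    w, h = size
    if min(w, h) <= target:
        return size
    scale = target / min(w, h)
    return (round(w * scale), round(h * scale))
```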
#### Who are the source language producers?
The same as in the LabelMe, SUN, and Places datasets.
### Annotations
#### Annotation process
Annotation process for the ADE20K dataset:
> **Image Annotation.** For our dataset, we are interested in having a diverse set of scenes with dense annotations of all the objects present. Images come from the LabelMe, SUN datasets, and Places and were selected to cover the 900 scene categories defined in the SUN database. Images were annotated by a single expert worker using the LabelMe interface. Fig. 2 shows a snapshot of the annotation interface and one fully segmented image. The worker provided three types of annotations: object segments with names, object parts, and attributes. All object instances are segmented independently so that the dataset could be used to train and evaluate detection or segmentation algorithms. Datasets such as COCO, Pascal or Cityscapes start by defining a set of object categories of interest. However, when labeling all the objects in a scene, working with a predefined list of objects is not possible as new categories appear frequently (see fig. 5.d). Here, the annotator created a dictionary of visual concepts where new classes were added constantly to ensure consistency in object naming. Object parts are associated with object instances. Note that parts can have parts too, and we label these associations as well. For example, the ‘rim’ is a part of a ‘wheel’, which in turn is part of a ‘car’. A ‘knob’ is a part of a ‘door’ that can be part of a ‘cabinet’. The total part hierarchy has a depth of 3. The object and part hierarchy is in the supplementary materials.
> **Annotation Consistency.** Defining a labeling protocol is relatively easy when the labeling task is restricted to a fixed list of object classes, but it becomes challenging when the class list is open-ended. As the goal is to label all the objects within each image, the list of classes grows unbounded. Many object classes appear only a few times across the entire collection of images. However, those rare object classes cannot be ignored as they might be important elements for the interpretation of the scene. Labeling in these conditions becomes difficult because we need to keep a growing list of all the object classes in order to have consistent naming across the entire dataset. Despite the annotator’s best effort, the process is not free of noise. To analyze the annotation consistency we took a subset of 61 randomly chosen images from the validation set, then asked our annotator to annotate them again (there is a time difference of six months). One expects that there are some differences between the two annotations. A few examples are shown in Fig 3. On average, 82.4% of the pixels got the same label. The remaining 17.6% of pixels had errors, which we grouped into three types as follows:
>
> • Segmentation quality: Variations in the quality of segmentation and outlining of the object boundary. One typical source of error arises when segmenting complex objects such as buildings and trees, which can be segmented with different degrees of precision. 5.7% of the pixels had this type of error.
>
> • Object naming: Differences in object naming (due to ambiguity or similarity between concepts, for instance calling a big car a ‘car’ in one segmentation and a ‘truck’ in another, or a ‘palm tree’ a ‘tree’). 6.0% of the pixels had naming issues. These errors can be reduced by defining a very precise terminology, but this becomes much harder with a large growing vocabulary.
>
> • Segmentation quantity: Missing objects in one of the two segmentations. There is a very large number of objects in each image and some images might be annotated more thoroughly than others. For example, in the third column of Fig 3 the annotator missed some small objects in different annotations. 5.9% of the pixels are due to missing labels. A similar issue existed in segmentation datasets such as the Berkeley Image segmentation dataset.
>
> The median error values for the three error types are: 4.8%, 0.3% and 2.6% showing that the mean value is dominated by a few images, and that the most common type of error is segmentation quality.
> To further compare the annotation done by our single expert annotator and the AMT-like annotators, 20 images from the validation set were annotated by two invited external annotators, both with prior experience in image labeling. The first external annotator had 58.5% of inconsistent pixels compared to the segmentation provided by our annotator, and the second external annotator had 75% inconsistent pixels. Many of these inconsistencies are due to the poor quality of the segmentations provided by the external annotators (as has been observed with AMT, which requires multiple verification steps for quality control). For the best external annotator (the first one), 7.9% of pixels have inconsistent segmentations (just slightly worse than our annotator), 14.9% have inconsistent object naming and 35.8% of the pixels correspond to missing objects, which is due to the much smaller number of objects annotated by the external annotator in comparison with the ones annotated by our expert annotator. The external annotators labeled on average 16 segments per image while our annotator provided 29 segments per image.
#### Who are the annotators?
A single expert annotator (for the full dataset), plus two invited external annotators with prior experience in image labeling (for the consistency analysis).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Refer to the `Annotation Consistency` subsection of `Annotation Process`.
## Additional Information
### Dataset Curators
Bolei Zhou, Hang Zhao, Xavier Puig, Sanja Fidler, Adela Barriuso and Antonio Torralba.
### Licensing Information
The MIT Scene Parsing Benchmark dataset is licensed under a [BSD 3-Clause License](https://github.com/CSAILVision/sceneparsing/blob/master/LICENSE).
### Citation Information
```bibtex
@inproceedings{zhou2017scene,
title={Scene Parsing through ADE20K Dataset},
author={Zhou, Bolei and Zhao, Hang and Puig, Xavier and Fidler, Sanja and Barriuso, Adela and Torralba, Antonio},
booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
year={2017}
}
@article{zhou2016semantic,
title={Semantic understanding of scenes through the ade20k dataset},
author={Zhou, Bolei and Zhao, Hang and Puig, Xavier and Fidler, Sanja and Barriuso, Adela and Torralba, Antonio},
journal={arXiv preprint arXiv:1608.05442},
year={2016}
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
KentoTsu/Bets | ---
license: openrail
---
|
Christoph911/German-legal-SQuAD | ---
license: mit
---
|
EleutherAI/quirky_addition_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: template_args
struct:
- name: character
dtype: string
- name: op1
dtype: int64
- name: op2
dtype: int64
- name: result
dtype: int64
- name: character
dtype: string
- name: label
dtype: bool
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: difficulty_quantile
dtype: float64
splits:
- name: train
num_bytes: 26256000
num_examples: 384000
- name: validation
num_bytes: 547000
num_examples: 8000
- name: test
num_bytes: 547000
num_examples: 8000
download_size: 13465330
dataset_size: 27350000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
AdapterOcean/code_instructions_standardized_cluster_10_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8041456
num_examples: 11514
download_size: 3654425
dataset_size: 8041456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_10_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cheetor1996/Natsu_Hyuga | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Natsu Hyuga** from **Kansen 5**
- *Trained with Anime (full-final-pruned) model*
- *Works best with ALL, MIDD, OUTD, and OUTALL LoRA weight blocks, and with 0.4-0.9 weights.* |
Intuit-GenSRF/es_mental_health_counseling | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Response
dtype: string
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 13763461
num_examples: 3512
download_size: 7425319
dataset_size: 13763461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_mental_health_counseling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ConvexAI__BurningBruce-005 | ---
pretty_name: Evaluation run of ConvexAI/BurningBruce-005
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ConvexAI/BurningBruce-005](https://huggingface.co/ConvexAI/BurningBruce-005)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__BurningBruce-005\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T18:58:34.137305](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-005/blob/main/results_2024-02-02T18-58-34.137305.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536170901380703,\n\
\ \"acc_stderr\": 0.032028038336707275,\n \"acc_norm\": 0.6528681277337212,\n\
\ \"acc_norm_stderr\": 0.032697450933548394,\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6726501530582988,\n\
\ \"mc2_stderr\": 0.015249067039770463\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002507,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7117108145787692,\n\
\ \"acc_stderr\": 0.004520406331084042,\n \"acc_norm\": 0.8830910177255527,\n\
\ \"acc_norm_stderr\": 0.003206551283257396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6726501530582988,\n\
\ \"mc2_stderr\": 0.015249067039770463\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781096\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \
\ \"acc_stderr\": 0.012435042334904004\n }\n}\n```"
repo_url: https://huggingface.co/ConvexAI/BurningBruce-005
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-58-34.137305.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- '**/details_harness|winogrande|5_2024-02-02T18-58-34.137305.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T18-58-34.137305.parquet'
- config_name: results
data_files:
- split: 2024_02_02T18_58_34.137305
path:
- results_2024-02-02T18-58-34.137305.parquet
- split: latest
path:
- results_2024-02-02T18-58-34.137305.parquet
---
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-005
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-005](https://huggingface.co/ConvexAI/BurningBruce-005) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-005",
"harness_winogrande_5",
split="train")
```
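The timestamped split names in the YAML above are a simple transformation of the run timestamp (dashes and colons become underscores, the dot is kept). A minimal sketch of that mapping; the helper name is ours, not part of the `datasets` API:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Convert a run timestamp like '2024-02-02T18:58:34.137305'
    to the split name used in this card's YAML config."""
    # Replace the characters that are not allowed in split names.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-02T18:58:34.137305"))
# 2024_02_02T18_58_34.137305
```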
## Latest results
These are the [latest results from run 2024-02-02T18:58:34.137305](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-005/blob/main/results_2024-02-02T18-58-34.137305.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6536170901380703,
"acc_stderr": 0.032028038336707275,
"acc_norm": 0.6528681277337212,
"acc_norm_stderr": 0.032697450933548394,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6726501530582988,
"mc2_stderr": 0.015249067039770463
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002507,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.7117108145787692,
"acc_stderr": 0.004520406331084042,
"acc_norm": 0.8830910177255527,
"acc_norm_stderr": 0.003206551283257396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6726501530582988,
"mc2_stderr": 0.015249067039770463
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781096
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.012435042334904004
}
}
```
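The "all" block at the top of the results is an average over the per-task scores. A minimal sketch of that aggregation, using three of the task accuracies from the blob above (not the full task set, so the value differs from the reported 0.6536):

```python
# Toy subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.7766990291262136},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846},
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
}

# Macro average: unweighted mean of the per-task "acc" values.
accs = [v["acc"] for v in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
```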
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pradeep239/philp_plain_5k | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 882548358.845
num_examples: 1873
- name: validation
num_bytes: 107622995.0
num_examples: 220
- name: test
num_bytes: 53224252.0
num_examples: 111
download_size: 771789438
dataset_size: 1043395605.845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
fathyshalab/reklamation24_schoenheit-wellness-intent | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 178584
num_examples: 397
- name: test
num_bytes: 47435
num_examples: 100
download_size: 127871
dataset_size: 226019
---
# Dataset Card for "reklamation24_schoenheit-wellness-intent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-machine_learning-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 34262
num_examples: 112
download_size: 19343
dataset_size: 34262
---
# Dataset Card for "mmlu-machine_learning-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
denizzhansahin/Turkish_News-2024 | ---
dataset_info:
features:
- name: 'Unnamed: 0.2'
dtype: int64
- name: Baslik
dtype: string
- name: Ozet
dtype: string
- name: Kategori
dtype: string
- name: Link
dtype: string
- name: Icerik
dtype: string
- name: 'Unnamed: 0'
dtype: float64
splits:
- name: train
num_bytes: 49152035.39270457
num_examples: 19170
- name: validation
num_bytes: 21068454.60729543
num_examples: 8217
download_size: 40617048
dataset_size: 70220490.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
0x-YuAN/source | ---
dataset_info:
features:
- name: reason
dtype: string
- name: self_comment
dtype: string
- name: other_comment
dtype: string
- name: relatedIssues
list:
- name: issueRef
dtype: string
- name: lawName
dtype: string
splits:
- name: train
num_bytes: 1975024677
num_examples: 234054
download_size: 553769254
dataset_size: 1975024677
---
# Dataset Card for "source"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shularp/350k_dataset_health_ar_en_th | ---
dataset_info:
features:
- name: ar
dtype: string
- name: en
dtype: string
- name: th
dtype: string
splits:
- name: validation
num_bytes: 4370651
num_examples: 10078
- name: test
num_bytes: 4378778
num_examples: 10108
- name: train
num_bytes: 122924727
num_examples: 268888
download_size: 70750385
dataset_size: 131674156
---
# Dataset Card for "350k_dataset_health_ar_en_th"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
r-three/Phatgoose_flanv2_offline | ---
license: mit
---
|
Codec-SUPERB/quesst14_all_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 8000
- name: id
dtype: string
splits:
- name: original
num_bytes: 1368882918.0
num_examples: 13607
- name: academicodec_hifi_16k_320d
num_bytes: 2733824255.0
num_examples: 13607
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 2733824255.0
num_examples: 13607
- name: academicodec_hifi_24k_320d
num_bytes: 4100996735.0
num_examples: 13607
- name: audiodec_24k_320d
num_bytes: 4107921615.0
num_examples: 13607
- name: dac_16k
num_bytes: 2736769119.0
num_examples: 13607
- name: dac_24k
num_bytes: 4104632271.0
num_examples: 13607
- name: dac_44k
num_bytes: 7541396965.0
num_examples: 13607
- name: encodec_24k_12bps
num_bytes: 4104632271.0
num_examples: 13607
- name: encodec_24k_1_5bps
num_bytes: 4104632271.0
num_examples: 13607
- name: encodec_24k_24bps
num_bytes: 4104632271.0
num_examples: 13607
- name: encodec_24k_3bps
num_bytes: 4104632271.0
num_examples: 13607
- name: encodec_24k_6bps
num_bytes: 4104632271.0
num_examples: 13607
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 2736757881.0
num_examples: 13607
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 2736757881.0
num_examples: 13607
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 2737231757.0
num_examples: 13607
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 2737231757.0
num_examples: 13607
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 2737231757.0
num_examples: 13607
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 2737231757.0
num_examples: 13607
- name: speech_tokenizer_16k
num_bytes: 2740983853.0
num_examples: 13607
download_size: 18905736290
dataset_size: 69114836131.0
---
# Dataset Card for "quesst14_all_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/dataE | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 12910
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Maikin023/piuvoz | ---
license: openrail
---
|
tollefj/sickr-sts-NOB | ---
license: cc-by-4.0
---
# Translated STS dataset to Norwegian Bokmål
Machine translated using the *No language left behind* model series, specifically the 1.3B variant: https://huggingface.co/facebook/nllb-200-distilled-1.3B |
davanstrien/model_cards_with_readmes_sections | ---
dataset_info:
features:
- name: license
dtype: string
- name: tags
dtype: string
- name: is_nc
dtype: bool
- name: readme_section
dtype: string
- name: hash
dtype: string
splits:
- name: train
num_bytes: 28801782.8572217
num_examples: 32124
download_size: 13668782
dataset_size: 28801782.8572217
---
# Dataset Card for "model_cards_with_readmes_sections"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/scriptonite | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/scriptonite"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.251394 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/411d50392aef867fe0e9dd55a074ecfb.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/scriptonite">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Скриптонит (Scriptonite)</div>
<a href="https://genius.com/artists/scriptonite">
<div style="text-align: center; font-size: 14px;">@scriptonite</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/scriptonite).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the 🤗 Datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/scriptonite")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|367| -| -|
The `train` split can easily be divided into `train`, `validation`, and `test` with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/scriptonite")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# np.split takes cumulative cut points, so the remaining 3% ends up in `test`.
train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
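Note that `np.split` takes cumulative cut indices, not chunk sizes. The same logic can be checked offline on a toy list (the placeholder texts below are illustrative, not real dataset entries):

```python
import numpy as np

# Toy stand-in for datasets['train']['text']: 100 placeholder lyrics.
texts = [f"song {i}" for i in range(100)]

train_percentage = 0.9
validation_percentage = 0.07

# np.split expects cumulative cut points: [90, 97] -> chunks of 90/7/3 items.
train, validation, test = np.split(
    np.array(texts),
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
print(len(train), len(validation), len(test))  # 90 7 3
```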
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
HuggingFaceH4/ultrachat_200k | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: UltraChat 200k
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 1397058554
num_examples: 207865
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 1624049723
dataset_size: 3047427114
---
# Dataset Card for UltraChat 200k
## Dataset Description
This is a heavily filtered version of the [UltraChat](https://github.com/thunlp/UltraChat) dataset and was used to train [Zephyr-7B-β](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), a state-of-the-art 7B chat model.
The original dataset consists of 1.4M dialogues generated by ChatGPT, spanning a wide range of topics. To create `UltraChat 200k`, we applied the following logic:
- Selection of a subset of data for faster supervised fine-tuning.
- Truecasing of the dataset, as we observed around 5% of the data contained grammatical errors like "Hello. how are you?" instead of "Hello. How are you?"
- Removal of dialogues where the assistant replies with phrases like "I do not have emotions" or "I don't have opinions", even for fact-based prompts that don't involve either.
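The deny-phrase filter in the last bullet can be sketched as follows. This is a minimal illustration: the actual phrase list and filtering code used to build UltraChat 200k are not published in this card.

```python
# Illustrative deny-list; the real filter's phrase set is an assumption here.
DENY_PHRASES = ("i do not have emotions", "i don't have opinions")

def keep_dialogue(messages):
    """Drop a dialogue if any assistant turn contains a deny phrase."""
    for turn in messages:
        if turn["role"] != "assistant":
            continue
        reply = turn["content"].lower()
        if any(phrase in reply for phrase in DENY_PHRASES):
            return False
    return True

dialogue = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "I do not have emotions, but the capital is Paris."},
]
print(keep_dialogue(dialogue))  # False
```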
## Dataset Structure
The dataset has four splits, suitable for:
* Supervised fine-tuning (`sft`).
* Generation ranking (`gen`) via techniques like rejection sampling or PPO.
The number of examples per split is shown as follows:
| train_sft | test_sft | train_gen | test_gen |
|:-------:|:-----------:|:-----:| :-----:|
| 207865 | 23110 | 256032 | 28304 |
The dataset is stored in Parquet format, with each entry using the following schema:
```
{
"prompt": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"messages":[
{
"content": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"role": "user"
},
{
"content": "Name: Ava\n\n Ava was just 16 years old when the world as she knew it came crashing down. The government had collapsed, leaving behind a chaotic and lawless society. ...",
"role": "assistant"
},
{
"content": "Wow, Ava's story is so intense and inspiring! Can you provide me with more details. ...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
},
{
"content": "That's really interesting! I would love to hear more...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
}
],
"prompt_id": "d938b65dfe31f05f80eb8572964c6673eddbd68eff3db6bd234d7f1e3b86c2af"
}
```
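For supervised fine-tuning, the `messages` list is typically flattened into a single training string. A minimal sketch is shown below; the `<|role|>` tags are illustrative only, not the actual Zephyr chat template:

```python
def to_training_text(example):
    """Join a messages list into one string with simple role tags."""
    parts = [f"<|{m['role']}|>\n{m['content']}" for m in example["messages"]]
    return "\n".join(parts)

# Shortened example following the schema above.
example = {
    "prompt": "Name three primary colors.",
    "messages": [
        {"role": "user", "content": "Name three primary colors."},
        {"role": "assistant", "content": "Red, yellow, and blue."},
    ],
}
print(to_training_text(example))
```

In practice you would use the tokenizer's own chat template rather than hand-rolled tags.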
## Citation
If you find this dataset useful in your work, please cite the original UltraChat dataset:
```
@misc{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ning Ding and Yulin Chen and Bokai Xu and Yujia Qin and Zhi Zheng and Shengding Hu and Zhiyuan Liu and Maosong Sun and Bowen Zhou},
year={2023},
eprint={2305.14233},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
You may also wish to cite the Zephyr 7B technical report:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
liuyanchen1015/MULTI_VALUE_sst2_comparative_more_and | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3080
num_examples: 19
- name: test
num_bytes: 6036
num_examples: 38
- name: train
num_bytes: 73392
num_examples: 631
download_size: 35653
dataset_size: 82508
---
# Dataset Card for "MULTI_VALUE_sst2_comparative_more_and"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_QA_1200_per400_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5478684
num_examples: 1200
download_size: 2970119
dataset_size: 5478684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pinhamusic/candicegomes3 | ---
license: openrail
---
|