| datasetId | card |
|---|---|
CJWeiss/multishort | ---
dataset_info:
features:
- name: id
dtype: string
- name: sources
sequence: string
- name: summary/long
dtype: string
- name: summary/short
dtype: string
- name: summary/tiny
dtype: string
splits:
- name: train
num_bytes: 949594524.2185664
num_examples: 2340
- name: test
num_bytes: 189516235.24229074
num_examples: 486
- name: valid
num_bytes: 137063421.14537445
num_examples: 312
download_size: 762638149
dataset_size: 1276174180.6062317
---
# Dataset Card for "multishort"
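The split metadata above gives byte and example counts for each split; as a quick sanity check, the average size per example can be derived directly from those numbers (a minimal sketch using figures copied verbatim from the `splits` block):

```python
# Split sizes copied from the dataset_info metadata above.
splits = {
    "train": (949594524.2185664, 2340),
    "test": (189516235.24229074, 486),
    "valid": (137063421.14537445, 312),
}

# Average example size per split, in megabytes.
for name, (num_bytes, num_examples) in splits.items():
    avg_mb = num_bytes / num_examples / 1e6
    print(f"{name}: ~{avg_mb:.2f} MB per example")
```

Each split averages roughly 0.4 MB per example, consistent with examples that bundle multiple source documents alongside their summaries.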
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_L-R__LLmRa-2.7B | ---
pretty_name: Evaluation run of L-R/LLmRa-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [L-R/LLmRa-2.7B](https://huggingface.co/L-R/LLmRa-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_L-R__LLmRa-2.7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T14:52:35.782186](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-2.7B_public/blob/main/results_2023-11-13T14-52-35.782186.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2619182180653927,\n\
\ \"acc_stderr\": 0.031054877346083407,\n \"acc_norm\": 0.2636967484818349,\n\
\ \"acc_norm_stderr\": 0.031856551298856575,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602581,\n \"mc2\": 0.3522535522108365,\n\
\ \"mc2_stderr\": 0.01379814047299605,\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413285,\n \"f1\": 0.04760067114093977,\n\
\ \"f1_stderr\": 0.0011764663842453984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32081911262798635,\n \"acc_stderr\": 0.013640943091946526,\n\
\ \"acc_norm\": 0.3703071672354949,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4561840270862378,\n\
\ \"acc_stderr\": 0.004970585328297622,\n \"acc_norm\": 0.6064528978291177,\n\
\ \"acc_norm_stderr\": 0.0048753793520798245\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23015873015873015,\n \"acc_stderr\": 0.021679219663693135,\n \"\
acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.021679219663693135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287394,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.19032258064516128,\n \"acc_stderr\": 0.02233170761182307,\n \"\
acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.02233170761182307\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861507,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859693,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859693\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.029696338713422893,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.029696338713422893\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.21076233183856502,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417614,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417614\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049046,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992002,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992002\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3440514469453376,\n\
\ \"acc_stderr\": 0.026981478043648022,\n \"acc_norm\": 0.3440514469453376,\n\
\ \"acc_norm_stderr\": 0.026981478043648022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.02265834408598136,\n\
\ \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.02265834408598136\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590634,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590634\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113899,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113899\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.023886881922440345,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.023886881922440345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.29850746268656714,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.29850746268656714,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602581,\n \"mc2\": 0.3522535522108365,\n\
\ \"mc2_stderr\": 0.01379814047299605\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6156274664561957,\n \"acc_stderr\": 0.01367156760083619\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \
\ \"em_stderr\": 0.0003144653119413285,\n \"f1\": 0.04760067114093977,\n\
\ \"f1_stderr\": 0.0011764663842453984\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245427\n\
\ }\n}\n```"
repo_url: https://huggingface.co/L-R/LLmRa-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|arc:challenge|25_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|drop|3_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|gsm8k|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hellaswag|10_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T14-52-35.782186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- '**/details_harness|winogrande|5_2023-11-13T14-52-35.782186.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T14-52-35.782186.parquet'
- config_name: results
data_files:
- split: 2023_11_13T14_52_35.782186
path:
- results_2023-11-13T14-52-35.782186.parquet
- split: latest
path:
- results_2023-11-13T14-52-35.782186.parquet
---
# Dataset Card for Evaluation run of L-R/LLmRa-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/L-R/LLmRa-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [L-R/LLmRa-2.7B](https://huggingface.co/L-R/LLmRa-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-2.7B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-13T14:52:35.782186](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-2.7B_public/blob/main/results_2023-11-13T14-52-35.782186.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2619182180653927,
"acc_stderr": 0.031054877346083407,
"acc_norm": 0.2636967484818349,
"acc_norm_stderr": 0.031856551298856575,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602581,
"mc2": 0.3522535522108365,
"mc2_stderr": 0.01379814047299605,
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413285,
"f1": 0.04760067114093977,
"f1_stderr": 0.0011764663842453984
},
"harness|arc:challenge|25": {
"acc": 0.32081911262798635,
"acc_stderr": 0.013640943091946526,
"acc_norm": 0.3703071672354949,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.4561840270862378,
"acc_stderr": 0.004970585328297622,
"acc_norm": 0.6064528978291177,
"acc_norm_stderr": 0.0048753793520798245
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.021679219663693135,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.021679219663693135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287394,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861507,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859693,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859693
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.029696338713422893,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.029696338713422893
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417614,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417614
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049046,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3440514469453376,
"acc_stderr": 0.026981478043648022,
"acc_norm": 0.3440514469453376,
"acc_norm_stderr": 0.026981478043648022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20987654320987653,
"acc_stderr": 0.02265834408598136,
"acc_norm": 0.20987654320987653,
"acc_norm_stderr": 0.02265834408598136
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590634,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590634
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113899,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113899
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.023886881922440345,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.023886881922440345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.29850746268656714,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.29850746268656714,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602581,
"mc2": 0.3522535522108365,
"mc2_stderr": 0.01379814047299605
},
"harness|winogrande|5": {
"acc": 0.6156274664561957,
"acc_stderr": 0.01367156760083619
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413285,
"f1": 0.04760067114093977,
"f1_stderr": 0.0011764663842453984
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245427
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AISE-TUDelft/Capybara | ---
configs:
- config_name: default
data_files:
- split: dedup_C
path: data/dedup_C-*
- split: dup_C
path: data/dup_C-*
- split: dedup_DecomC
path: data/dedup_DecomC-*
- split: dup_DecomC
path: data/dup_DecomC-*
- split: dedup_demiStripped
path: data/dedup_demiStripped-*
- split: dup_demiStripped
path: data/dup_demiStripped-*
- split: no_fun_demiStripped
path: data/no_fun_demiStripped-*
- split: dup_stripped
path: data/dup_stripped-*
- split: dedup_stripped
path: data/dedup_stripped-*
dataset_info:
features:
- name: id
dtype: int64
- name: docstring_tokens
sequence: string
- name: code_tokens
sequence: string
- name: fun_name
dtype: string
- name: repo
dtype: string
- name: starting
dtype: string
- name: partition
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: dedup_C
num_bytes: 167770495
num_examples: 79673
- name: dup_C
num_bytes: 348707539
num_examples: 214587
- name: dedup_DecomC
num_bytes: 330052224
num_examples: 79673
- name: dup_DecomC
num_bytes: 614158883
num_examples: 214587
- name: dedup_demiStripped
num_bytes: 316991021
num_examples: 79673
- name: dup_demiStripped
num_bytes: 590234671
num_examples: 214587
- name: no_fun_demiStripped
num_bytes: 606914210
num_examples: 214587
- name: dup_stripped
num_bytes: 60563000
num_examples: 14245
- name: dedup_stripped
num_bytes: 40485701
num_examples: 7826
download_size: 592873091
dataset_size: 3075877744
license: apache-2.0
task_categories:
- summarization
tags:
- code
- Reverse Engineering
- Binary
- Code Summarization
size_categories:
- 100K<n<1M
---
# Dataset Card for "Capybara"
## Dataset Description
- **Repository:** https://github.com/AISE-TUDelft/Capybara-BinT5
- **Paper:** https://huggingface.co/papers/2301.01701
- **Point of Contact:** https://huggingface.co/aalkaswan
- **Raw Data:** https://zenodo.org/records/7229809
### Dataset Summary
Dataset used to train [BinT5](https://huggingface.co/collections/AISE-TUDelft/bint5-65bd006a8c90bd5c97485244). Please refer to the paper for more information.
### Citation Information
```
@inproceedings{alkaswan2023extending,
title={Extending Source Code Pre-Trained Language Models to Summarise Decompiled Binaries},
author={Al-Kaswan, Ali and Ahmed, Toufique and Izadi, Maliheh and Sawant, Anand Ashok and Devanbu, Premkumar and van Deursen, Arie},
booktitle={2023 IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER)},
pages={260--271},
year={2023},
organization={IEEE}
}
``` |
gryffindor-ISWS/CLIP_metrics_img-img | ---
license: gpl-3.0
language:
- en
--- |
Kavinprasanth/demo_dataset | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: string
splits:
- name: train
num_bytes: 11050
num_examples: 50
download_size: 6456
dataset_size: 11050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shubhamagarwal92/rw_2308_filtered | ---
dataset_info:
features:
- name: aid
dtype: string
- name: mid
dtype: string
- name: abstract
dtype: string
- name: corpusid
dtype: int64
- name: text_except_rw
dtype: string
- name: title
dtype: string
- name: related_work
dtype: string
- name: original_related_work
dtype: string
- name: ref_abstract
struct:
- name: abstract
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_original
struct:
- name: abstract
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_full_text
struct:
- name: abstract
sequence: string
- name: all_para_text
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_full_text_original
struct:
- name: abstract
sequence: string
- name: all_para_text
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: total_cites
dtype: int64
splits:
- name: test
num_bytes: 254996014
num_examples: 1000
download_size: 106899160
dataset_size: 254996014
---
# Dataset Card for "rw_2308_filtered"
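The `ref_abstract` struct in the schema above stores parallel sequences keyed by citation marker. A minimal sketch of what one record may look like, with hypothetical values (only the field names come from the schema above, including the `corpursid` spelling as declared; the sequences are presumably index-aligned):

```python
# Hypothetical record mirroring the declared schema of rw_2308_filtered.
# All values are invented for illustration; only the field names are from the card.
record = {
    "aid": "2308.00001",
    "corpusid": 123456,
    "title": "A Hypothetical Paper",
    "related_work": "Prior work [@cite_1] studied ...",
    "ref_abstract": {
        "abstract": ["Abstract of the first cited paper."],
        "cite_N": ["@cite_1"],
        "corpursid": ["987654"],  # field name as spelled in the schema
    },
    "total_cites": 1,
}

# Pair each citation marker with its cited abstract (index-aligned sequences).
for n, abstract in zip(record["ref_abstract"]["cite_N"],
                       record["ref_abstract"]["abstract"]):
    print(n, "->", abstract)
```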
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/data-standardized_cluster_8_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8844836
num_examples: 4421
download_size: 3644121
dataset_size: 8844836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_8_alpaca"
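Each row pairs an `input` string with an `output` string. A minimal sketch of turning such a pair into a single training string, using a hypothetical template (the card does not document how the two fields were actually joined for training):

```python
def to_prompt(example: dict) -> str:
    # Hypothetical instruction-style template; the card does not
    # specify the formatting used when this data was consumed.
    return f"### Input:\n{example['input']}\n\n### Output:\n{example['output']}"

# Invented example row with the card's two declared columns.
row = {"input": "Summarize: the sky is blue.", "output": "The sky is blue."}
print(to_prompt(row))
```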
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lowo/ncep-TestData2 | ---
license: mit
---
|
zhangshuoming/x86_c_O0_exebench_json_cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1495247998.4292047
num_examples: 679665
download_size: 195075844
dataset_size: 1495247998.4292047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "x86_c_O0_exebench_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/soict_train_non_value_new | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: intent
dtype: string
- name: sentence_annotation
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: origin_transcription
dtype: string
- name: sentence_norm
dtype: string
- name: w2v2_base_5grams_transcription
dtype: string
- name: w2v2_large_5grams_transcription
dtype: string
splits:
- name: train
download_size: 1881
dataset_size: 0.0
---
# Dataset Card for "soict_train_non_value_new"
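Note that the `audio` column above is declared as a plain struct (float array, path, sampling rate) rather than a `datasets` `Audio` feature. A minimal sketch of working with such a struct, using invented sample values (only the field layout comes from the card's schema):

```python
# Hypothetical example following the declared audio struct of this card.
# The samples and path are invented; only the field layout is from the schema.
example = {
    "audio": {
        "array": [0.0, 0.01, -0.02, 0.03],  # raw waveform samples
        "path": "audio/hypothetical.wav",
        "sampling_rate": 16000,
    },
    "sentence": "hypothetical transcript",
}

# Duration in seconds = number of samples / sampling rate.
duration_s = len(example["audio"]["array"]) / example["audio"]["sampling_rate"]
print(f"{duration_s:.6f} s")
```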
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenEmpathic/Emotions-eng | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b | ---
pretty_name: Evaluation run of radm/Philosophy-Platypus2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [radm/Philosophy-Platypus2-13b](https://huggingface.co/radm/Philosophy-Platypus2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T00:45:24.163346](https://huggingface.co/datasets/open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b/blob/main/results_2023-08-29T00%3A45%3A24.163346.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5437981691869808,\n \"\
acc_stderr\": 0.03484311795554624,\n \"acc_norm\": 0.547878610439407,\n \
\ \"acc_norm_stderr\": 0.034826606717822575,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.37335488461829447,\n\
\ \"mc2_stderr\": 0.014112790281285795\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633822,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.004920800313232742,\n \"acc_norm\": 0.785202150965943,\n\
\ \"acc_norm_stderr\": 0.0040984271589492634\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724352,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n\
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633146,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633146\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484253,\n\
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834949,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834949\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643637,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643637\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387303,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193186,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193186\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.37335488461829447,\n\
\ \"mc2_stderr\": 0.014112790281285795\n }\n}\n```"
repo_url: https://huggingface.co/radm/Philosophy-Platypus2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:45:24.163346.parquet'
- config_name: results
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- results_2023-08-29T00:45:24.163346.parquet
- split: latest
path:
- results_2023-08-29T00:45:24.163346.parquet
---
# Dataset Card for Evaluation run of radm/Philosophy-Platypus2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/radm/Philosophy-Platypus2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [radm/Philosophy-Platypus2-13b](https://huggingface.co/radm/Philosophy-Platypus2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T00:45:24.163346](https://huggingface.co/datasets/open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b/blob/main/results_2023-08-29T00%3A45%3A24.163346.json):
```python
{
"all": {
"acc": 0.5437981691869808,
"acc_stderr": 0.03484311795554624,
"acc_norm": 0.547878610439407,
"acc_norm_stderr": 0.034826606717822575,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.37335488461829447,
"mc2_stderr": 0.014112790281285795
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633822,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221004
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.004920800313232742,
"acc_norm": 0.785202150965943,
"acc_norm_stderr": 0.0040984271589492634
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484253,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834949,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643637,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643637
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387303,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.02672586880910079,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.02672586880910079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193186,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193186
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.37335488461829447,
"mc2_stderr": 0.014112790281285795
}
}
```
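The leaderboard's MMLU score is, to a first approximation, the mean `acc` over the `hendrycksTest-*` entries above. A minimal sketch of that aggregation, using a three-subject excerpt of the results dictionary (values copied from the JSON above):

```python
# Average "acc" over the MMLU (hendrycksTest) tasks, mirroring how the
# leaderboard aggregates per-subject scores. The dict below is a small
# excerpt of the full results shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.5477815699658704},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4222222222222222},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7602339181286549},
}

# Keep only the MMLU subjects, then take the simple mean of their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU subjects: {len(mmlu)}, mean acc: {mmlu_avg:.4f}")
```

Swapping the excerpt for the full dictionary (for instance, loaded from the `results` parquet file) gives the aggregate over all subjects.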
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hasarinduperera/bioluminescence-image-dataset | ---
license: openrail
---
|
pharaouk/algorithmic-reasoning-seed | ---
license: mit
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- code
size_categories:
- n<1K
---
# Dataset Card for Algorithmic Reasoning (seed)
**Note: This dataset is a WIP and most questions' answer sections are empty or incomplete! See also the "Other Known Limitations" section.**
**Warning: If you somehow do use this dataset, remember NOT to run any evaluation on the questions in this dataset after training on them!**
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** lemontea.Tom@gmail.com or https://github.com/lemonteaa
### Dataset Summary
Dataset to help LLMs learn how to reason about code, especially on algorithmic tasks, by seeing human demonstrations.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- Question title
- Question
- Thought - Internal thought process that reasons step by step/in an organized manner
- Answer presented to user (proof or code) - with explanation if necessary
### Data Splits
No splits as of now; everything is in the training split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Questions are those I personally remember from my career, selected based on being:
- interesting
- involving CS, math, or similar knowledge
- targeted at specific known weaknesses of existing open-source/source-available LLMs (e.g. index notation handling)
- practical/likely to appear in production work settings
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
Manually created entirely by me, written at a level of detail exceeding what usually appears on the internet (bootcamp/FAANG interview prep/leetcode-style training websites, etc.) to help AI/LLMs access knowledge that may be too obvious for humans to write down.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
None, as the questions cover general, objective knowledge.
## Considerations for Using the Data
### Social Impact of Dataset
Although it is doubtful this dataset can actually work, in the event it does, it may enhance the coding capability of LLMs (which is intended); that enhancement may in turn create downstream effects simply due to increased LLM capability.
### Discussion of Biases
As questions are selected partly based on my taste, areas of CS that I am not interested in may be underrepresented.
### Other Known Limitations
- While I try to cover various mainstream programming languages, each problem targets only one specific language.
- The dataset currently consists of free-style markdown files. A script could convert them to a more structured format.
- Questions are asked in a conversational tone rather than leetcode style with strict I/O specifications, so they may be more suitable for human evaluation than automated evaluation (e.g. automatically extracting code and running it in a sandbox against test cases).
- As the dataset is completely manually created by a single human, the dataset size is extremely small.
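One limitation above notes that a script could convert the free-style markdown into a more structured format. A minimal sketch of such a converter, assuming (purely for illustration) that each entry uses level-2 headings named after the data fields; the real files may be formatted differently:

```python
import json
import re

# Hypothetical converter from a free-style markdown entry to a JSON record.
# Assumes each entry uses level-2 headings named after the data fields;
# adjust SECTIONS to match the real files.
SECTIONS = ("Question title", "Question", "Thought", "Answer")

def parse_entry(markdown: str) -> dict:
    """Split a markdown entry into {field: text} by its ## headings."""
    record = {}
    # Capture each "## <name>" heading and the text up to the next heading.
    for match in re.finditer(r"^## (.+?)\n(.*?)(?=^## |\Z)", markdown,
                             flags=re.MULTILINE | re.DOTALL):
        name, body = match.group(1).strip(), match.group(2).strip()
        if name in SECTIONS:
            record[name] = body
    return record

# A made-up sample entry, only to exercise the parser.
sample = """## Question title
Reverse a linked list

## Question
How would you reverse a singly linked list in place?

## Thought
Walk the list once, re-pointing each node's `next` to the previous node.

## Answer
Iterate with `prev`/`curr` pointers; O(n) time, O(1) space.
"""

record = parse_entry(sample)
print(json.dumps(record, indent=2))
```

Writing one such record per line would yield a JSONL file suitable for standard dataset tooling.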
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
MIT
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HuggingFaceM4/FairFace | ---
license: cc-by-4.0
dataset_info:
- config_name: '0.25'
features:
- name: image
dtype: image
- name: age
dtype:
class_label:
names:
'0': 0-2
'1': 3-9
'2': 10-19
'3': 20-29
'4': 30-39
'5': 40-49
'6': 50-59
'7': 60-69
'8': more than 70
- name: gender
dtype:
class_label:
names:
'0': Male
'1': Female
- name: race
dtype:
class_label:
names:
'0': East Asian
'1': Indian
'2': Black
'3': White
'4': Middle Eastern
'5': Latino_Hispanic
'6': Southeast Asian
- name: service_test
dtype: bool
splits:
- name: train
num_bytes: 512915534.352
num_examples: 86744
- name: validation
num_bytes: 64453996.096
num_examples: 10954
download_size: 563437634
dataset_size: 577369530.448
- config_name: '1.25'
features:
- name: image
dtype: image
- name: age
dtype:
class_label:
names:
'0': 0-2
'1': 3-9
'2': 10-19
'3': 20-29
'4': 30-39
'5': 40-49
'6': 50-59
'7': 60-69
'8': more than 70
- name: gender
dtype:
class_label:
names:
'0': Male
'1': Female
- name: race
dtype:
class_label:
names:
'0': East Asian
'1': Indian
'2': Black
'3': White
'4': Middle Eastern
'5': Latino_Hispanic
'6': Southeast Asian
- name: service_test
dtype: bool
splits:
- name: train
num_bytes: 1860154641.104
num_examples: 86744
- name: validation
num_bytes: 236712623.794
num_examples: 10954
download_size: 2104494732
dataset_size: 2096867264.898
configs:
- config_name: '0.25'
data_files:
- split: train
path: 0.25/train-*
- split: validation
path: 0.25/validation-*
- config_name: '1.25'
data_files:
- split: train
path: 1.25/train-*
- split: validation
path: 1.25/validation-*
---
# Dataset Card for FairFace
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/joojs/fairface](https://github.com/joojs/fairface)
- **Repository:** [https://github.com/joojs/fairface](https://github.com/joojs/fairface)
- **Paper:** [https://openaccess.thecvf.com/content/WACV2021/papers/Karkkainen_FairFace_Face_Attribute_Dataset_for_Balanced_Race_Gender_and_Age_WACV_2021_paper.pdf](https://openaccess.thecvf.com/content/WACV2021/papers/Karkkainen_FairFace_Face_Attribute_Dataset_for_Balanced_Race_Gender_and_Age_WACV_2021_paper.pdf)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
FairFace is a face image dataset that is balanced across race. It contains 108,501 images from 7 race groups: White, Black, Indian, East Asian, Southeast Asian, Middle Eastern, and Latino.
Images were collected from the YFCC-100M Flickr dataset and labeled with race, gender, and age groups.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Each instance has the following structure:
```python
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=448x448 at 0x7FCABA221FA0>,
'age': 6,
'gender': 0,
'race': 0,
'service_test': True
}
```
### Data Fields
- `image`: The image
- `age`: Age class among `["0-2", "3-9", "10-19", "20-29", "30-39", "40-49", "50-59", "60-69", "more than 70"]`
- `gender`: Gender class among `["Male", "Female"]`
- `race`: Race class among `["East Asian", "Indian", "Black", "White", "Middle Eastern", "Latino_Hispanic", "Southeast Asian"]`
- `service_test`: Not sure what this is. See [issue](https://github.com/joojs/fairface/issues/9).
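When loading with 🤗 `datasets`, the `ClassLabel` features above come back as integers (`Dataset.features["age"].int2str` can decode them). As a dependency-free sketch, the helper below (the function name is ours) maps integers back to class names using the name lists from this card's schema:

```python
# Class-name lists copied from the dataset features described above.
AGE_NAMES = ["0-2", "3-9", "10-19", "20-29", "30-39", "40-49", "50-59", "60-69", "more than 70"]
GENDER_NAMES = ["Male", "Female"]
RACE_NAMES = ["East Asian", "Indian", "Black", "White", "Middle Eastern", "Latino_Hispanic", "Southeast Asian"]

def decode_labels(example):
    """Return a copy of an example with integer labels replaced by their class names."""
    out = dict(example)
    out["age"] = AGE_NAMES[example["age"]]
    out["gender"] = GENDER_NAMES[example["gender"]]
    out["race"] = RACE_NAMES[example["race"]]
    return out

# Using the instance shown above (image field omitted):
sample = {"age": 6, "gender": 0, "race": 0, "service_test": True}
print(decode_labels(sample))
# {'age': '50-59', 'gender': 'Male', 'race': 'East Asian', 'service_test': True}
```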
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@VictorSanh](https://github.com/VictorSanh) for adding this dataset. |
NaiveNeuron/wikigoldsk | ---
license: cc-by-sa-3.0
---
# Dataset Card for WikiGoldSK
- **Repository:** [https://github.com/NaiveNeuron/WikiGoldSK](https://github.com/NaiveNeuron/WikiGoldSK)
- **Paper:** [https://arxiv.org/abs/2304.04026](https://arxiv.org/abs/2304.04026)
### Dataset Summary
WikiGoldSK is a manually annotated Slovak NER dataset created from Wikipedia.
It contains more than 10k named entities from the categories PER, LOC, ORG, and MISC in the IOB2 format.
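Entity spans in the IOB2 scheme can be recovered from per-token tags with a small decoder. The sketch below (the function name and sample tags are ours, not part of the dataset) groups `B-`/`I-` runs into `(start, end, label)` spans:

```python
def iob2_to_spans(tags):
    """Convert a sequence of IOB2 tags into (start, end, label) spans, end exclusive.
    A B- tag always opens a new entity; O or an inconsistent I- closes the current one."""
    spans = []
    start = label = None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:
                spans.append((start, i, label))
            start, label = i, tag[2:]
        elif tag.startswith("I-") and label == tag[2:]:
            continue  # entity continues
        else:
            if start is not None:
                spans.append((start, i, label))
            start = label = None
    if start is not None:
        spans.append((start, len(tags), label))
    return spans

# Hypothetical tagged sentence: "Bratislava je hlavné mesto Slovenska ."
print(iob2_to_spans(["B-LOC", "O", "O", "O", "B-LOC", "O"]))
# [(0, 1, 'LOC'), (4, 5, 'LOC')]
```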
### Citation Information
```
@inproceedings{}
```
|
JetBrains-Research/lca-codegen-medium | ---
dataset_info:
features:
- name: repo
dtype: string
- name: commit_hash
dtype: string
- name: completion_file
struct:
- name: filename
dtype: string
- name: content
dtype: string
- name: completion_lines
struct:
- name: infile
sequence: int32
- name: inproject
sequence: int32
- name: common
sequence: int32
- name: commited
sequence: int32
- name: non_informative
sequence: int32
- name: random
sequence: int32
- name: repo_snapshot
sequence:
- name: filename
dtype: string
- name: content
dtype: string
- name: completion_lines_raw
struct:
- name: commited
sequence: int64
- name: common
sequence: int64
- name: infile
sequence: int64
- name: inproject
sequence: int64
- name: non_informative
sequence: int64
- name: other
sequence: int64
splits:
- name: test
num_bytes: 514928459
num_examples: 224
download_size: 225824560
dataset_size: 514928459
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# LCA Project Level Code Completion
## How to load the dataset
```python
from datasets import load_dataset
ds = load_dataset('JetBrains-Research/lca-codegen-medium', split='test')
```
## Data Point Structure
* `repo` -- repository name in format `{GitHub_user_name}__{repository_name}`
* `commit_hash` -- commit hash
* `completion_file` -- dictionary with the completion file content in the following format:
    * `filename` -- filepath to the completion file
    * `content` -- content of the completion file
* `completion_lines` -- dictionary where the keys are line classes and the values are lists of integers (the numbers of the lines to complete). The classes are:
    * `commited` -- the line contains at least one function or class that was declared in the committed files
    * `inproject` -- the line contains at least one function or class that was declared in the project (excluding the previous class)
    * `infile` -- the line contains at least one function or class that was declared in the completion file (excluding the previous classes)
    * `common` -- the line contains at least one function or class that was classified as common, e.g. `main`, `get`, etc. (excluding the previous classes)
    * `non_informative` -- the line was classified as non-informative, e.g. it is too short or contains comments
    * `random` -- randomly sampled from the rest of the lines
* `repo_snapshot` -- dictionary with a snapshot of the repository before the commit. Has the same structure as `completion_file`, but the filenames and contents are organized as lists.
* `completion_lines_raw` -- the same as `completion_lines`, but before sampling.
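Given a data point in this layout, the lines targeted for completion can be pulled out of the completion file by index. A minimal sketch (the helper name and the toy data point are ours, not from the dataset):

```python
def get_completion_targets(dp, category):
    """Return (line_number, line_text) pairs for one completion-line category."""
    lines = dp["completion_file"]["content"].splitlines()
    return [(i, lines[i]) for i in dp["completion_lines"].get(category, [])]

# Toy data point mimicking the structure described above.
dp = {
    "completion_file": {
        "filename": "pkg/util.py",
        "content": "import os\n\ndef read(path):\n    return open(path).read()\n",
    },
    "completion_lines": {"infile": [3], "common": [0]},
}
print(get_completion_targets(dp, "infile"))
# [(3, '    return open(path).read()')]
```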
## How we collected the data
* TBA |
CyberHarem/hoshino_ai_oshinoko | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hoshino Ai
This is the dataset of Hoshino Ai, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 435 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 435 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 435 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 435 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
swaroopajit/next-dataset-refined-batch-5000 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 307226208.0
num_examples: 1000
download_size: 278805299
dataset_size: 307226208.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "next-dataset-refined-batch-5000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/reklamation24_unternehmen-verbaende-full | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: mini-lm-sentence-transformers
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 28216223
num_examples: 5336
download_size: 0
dataset_size: 28216223
---
# Dataset Card for "reklamation24_unternehmen-verbaende-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
torchgeo/bathymetry | ---
license: cc-by-4.0
size_categories:
- 10K<n<100K
---
This dataset contains 8 NetCDF files.
Ground truth derived from [CRUST1.0](https://igppweb.ucsd.edu/~gabi/crust1.html):
* truth.nc
Predictions made by plate models:
* hs.nc
* psm.nc
* gdh1.nc
* h13.nc
Predictions made by ML models:
* ridge.nc
* svr.nc
* mlp.nc |
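Each prediction grid can be scored against `truth.nc` once loaded (e.g. with `xarray.open_dataset`). As a file-free sketch of the comparison itself, here is a plain-Python RMSE over matching grid cells (the toy grids below are ours, not taken from the NetCDF files):

```python
import math

def rmse(pred, truth):
    """Root-mean-square error between two equally shaped 2-D grids."""
    n = 0
    total = 0.0
    for prow, trow in zip(pred, truth):
        for p, t in zip(prow, trow):
            n += 1
            total += (p - t) ** 2
    return math.sqrt(total / n)

# Toy 2x2 depth grids; in practice these arrays would come from the NetCDF files.
truth_grid = [[4.0, 5.0], [6.0, 5.5]]
pred_grid = [[4.2, 5.1], [5.8, 5.5]]
print(round(rmse(pred_grid, truth_grid), 3))  # 0.15
```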
open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-dolphin-orca-platypus-34b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T00:32:33.472586](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b/blob/main/results_2023-10-29T00-32-33.472586.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37080536912751677,\n\
\ \"em_stderr\": 0.004946581424326503,\n \"f1\": 0.42342072147651116,\n\
\ \"f1_stderr\": 0.004815729646559334,\n \"acc\": 0.439759976974257,\n\
\ \"acc_stderr\": 0.011098891058626454\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.37080536912751677,\n \"em_stderr\": 0.004946581424326503,\n\
\ \"f1\": 0.42342072147651116,\n \"f1_stderr\": 0.004815729646559334\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \
\ \"acc_stderr\": 0.0097560636603599\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T00_32_33.472586
path:
- '**/details_harness|drop|3_2023-10-29T00-32-33.472586.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T00-32-33.472586.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T00_32_33.472586
path:
- '**/details_harness|gsm8k|5_2023-10-29T00-32-33.472586.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T00-32-33.472586.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T00_32_33.472586
path:
- '**/details_harness|winogrande|5_2023-10-29T00-32-33.472586.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T00-32-33.472586.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- results_2023-10-04T00-22-19.968928.parquet
- split: 2023_10_29T00_32_33.472586
path:
- results_2023-10-29T00-32-33.472586.parquet
- split: latest
path:
- results_2023-10-29T00-32-33.472586.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-dolphin-orca-platypus-34b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T00:32:33.472586](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b/blob/main/results_2023-10-29T00-32-33.472586.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.37080536912751677,
"em_stderr": 0.004946581424326503,
"f1": 0.42342072147651116,
"f1_stderr": 0.004815729646559334,
"acc": 0.439759976974257,
"acc_stderr": 0.011098891058626454
},
"harness|drop|3": {
"em": 0.37080536912751677,
"em_stderr": 0.004946581424326503,
"f1": 0.42342072147651116,
"f1_stderr": 0.004815729646559334
},
"harness|gsm8k|5": {
"acc": 0.1470811220621683,
"acc_stderr": 0.0097560636603599
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893009
}
}
```
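A results payload like the one above can be traversed programmatically, for instance to pick out the task with the best accuracy. The sketch below copies the metric values shown above into a plain dictionary (the `per_task` / `best_task` helpers are illustrative, not part of the dataset):

```python
# Aggregate metrics copied from the "latest" results shown above.
results = {
    "all": {"em": 0.37080536912751677, "f1": 0.42342072147651116, "acc": 0.439759976974257},
    "harness|drop|3": {"em": 0.37080536912751677, "f1": 0.42342072147651116},
    "harness|gsm8k|5": {"acc": 0.1470811220621683},
    "harness|winogrande|5": {"acc": 0.7324388318863457},
}

# Keep only per-task entries that report accuracy, skipping the "all" aggregate.
per_task = {k: v for k, v in results.items() if k != "all" and "acc" in v}

# Task with the highest accuracy in this run.
best_task = max(per_task, key=lambda k: per_task[k]["acc"])
print(best_task)  # harness|winogrande|5
```

The same shape applies to the JSON files linked from each run, so this kind of traversal works on any results snapshot in the repo.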
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Franman/billetes-argentinos | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '100'
'1': '1000'
'2': '200'
splits:
- name: train
num_bytes: 4908690294.35
num_examples: 2386
download_size: 0
dataset_size: 4908690294.35
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marcus2000/timelist_dataset4finetuning_conspects | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Original
dtype: string
- name: Summary
dtype: string
- name: Task
dtype: string
splits:
- name: train
num_bytes: 1049996.152173913
num_examples: 39
- name: test
num_bytes: 188460.84782608695
num_examples: 7
download_size: 588122
dataset_size: 1238457.0
---
# Dataset Card for "timelist_dataset4finetuning_conspects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tristan/Sample_vqa_test_for_colab | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: DETA_detections_deta_swin_large_o365
list:
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: blip_caption_False_beams_5_Salesforce_blip_image_captioning_large_max_length_30_hf
dtype: string
- name: blip_caption_Salesforce_blip_image_captioning_large_intensive
sequence: string
- name: DETA_detections_deta_swin_large_o365_caption_all_patches_Salesforce_blip_image_captioning_large_
list:
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
splits:
- name: test
num_bytes: 2746703.0
num_examples: 10
download_size: 2136539
dataset_size: 2746703.0
---
# Dataset Card for "Sample_vqa_test_for_colab"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-126000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1030364
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1 | ---
pretty_name: Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6191660640057981,\n\
\ \"acc_stderr\": 0.03263652891344978,\n \"acc_norm\": 0.6271945727055741,\n\
\ \"acc_norm_stderr\": 0.03333445432068468,\n \"mc1\": 0.43329253365973075,\n\
\ \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n\
\ \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6580362477594105,\n\
\ \"acc_stderr\": 0.004733980470799212,\n \"acc_norm\": 0.8462457677753435,\n\
\ \"acc_norm_stderr\": 0.0035997580435468044\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914389,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914389\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43329253365973075,\n\
\ \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n\
\ \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209408\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20318423047763456,\n \
\ \"acc_stderr\": 0.011083227665267797\n }\n}\n```"
repo_url: https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- '**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet'
- config_name: results
data_files:
- split: 2024_01_14T06_20_20.648218
path:
- results_2024-01-14T06-20-20.648218.parquet
- split: latest
path:
- results_2024-01-14T06-20-20.648218.parquet
---
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
"harness_winogrande_5",
split="train")
```
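The config names in the metadata above map mechanically onto the harness task keys that appear in the results JSON below. A small helper can derive one from the other (this is an observation about the naming pattern in this card, not an official API):

```python
def task_key_to_config_name(key: str) -> str:
    """Map a harness task key to this repo's config name.

    Follows the pattern visible in this card's metadata, where
    "|", ":" and "-" in the task key all become "_".
    """
    return key.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_key_to_config_name("harness|hendrycksTest-public_relations|5"))
# -> harness_hendrycksTest_public_relations_5
```

The derived name can then be passed as the second argument to `load_dataset`, as in the snippet above.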
## Latest results
These are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6191660640057981,
"acc_stderr": 0.03263652891344978,
"acc_norm": 0.6271945727055741,
"acc_norm_stderr": 0.03333445432068468,
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6580362477594105,
"acc_stderr": 0.004733980470799212,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.0035997580435468044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914389,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914389
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209408
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
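The per-task entries in the JSON above all share the same shape, so it is straightforward to rank the tasks by score once the file is parsed. A minimal sketch (only a few entries from the run above are reproduced here; the full dict has one key per task):

```python
# Per-task accuracies copied from the results JSON above
# (acc_norm where reported, acc otherwise).
scores = {
    "harness|hendrycksTest-marketing|5": 0.8589743589743589,
    "harness|winogrande|5": 0.7829518547750592,
    "harness|hendrycksTest-abstract_algebra|5": 0.29,
    "harness|gsm8k|5": 0.20318423047763456,
}

# Sort tasks from strongest to weakest for this run.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, score in ranked:
    print(f"{score:.3f}  {task}")
```

The same pattern applies to the full results file after loading it with `json.load`.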
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chargoddard/Open-Platypus-Chat-Judged | ---
dataset_info:
- config_name: best_rated
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 16455644.962765958
num_examples: 10236
download_size: 7071171
dataset_size: 16455644.962765958
- config_name: default
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 39894811
num_examples: 24816
download_size: 18554361
dataset_size: 39894811
- config_name: worst_rated
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 236320.80984042553
num_examples: 147
download_size: 125546
dataset_size: 236320.80984042553
configs:
- config_name: best_rated
data_files:
- split: train
path: best_rated/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: worst_rated
data_files:
- split: train
path: worst_rated/train-*
size_categories:
- 10K<n<100K
---
# Dataset Card for "Open-Platypus-Chat-Judged"
This is [Open-Platypus-Chat](https://huggingface.co/datasets/chargoddard/Open-Platypus-Chat), judged for quality by [TheBloke/OpenOrca-Platypus2-13B-GPTQ](https://huggingface.co/TheBloke/OpenOrca-Platypus2-13B-GPTQ). Each row is annotated with a score on a scale of 1 to 5 and a brief explanation of why it was given that score.
As the "judge" was a relatively small model, and quantized at that, the ratings are far from perfect. This is from the first iteration of an experiment in dataset refinement. Definitely do not take this dataset as ground truth.
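If you want a cutoff other than the prebuilt `best_rated`/`worst_rated` configs, you can filter the `default` config on the `rating.score` field yourself. A minimal sketch of the filtering logic, using toy rows that mimic this card's schema (the example ids and analyses are made up):

```python
# Toy rows shaped like this dataset's schema: a `rating` struct
# with judge/score/analysis, plus id and conversations.
rows = [
    {"id": "ex1", "rating": {"judge": "judge-model", "score": 5, "analysis": "clear and correct"}},
    {"id": "ex2", "rating": {"judge": "judge-model", "score": 2, "analysis": "answer looks wrong"}},
]

def keep_high_quality(row, min_score=4):
    # Keep rows the judge scored at or above the threshold.
    return row["rating"]["score"] >= min_score

high_quality = [r for r in rows if keep_high_quality(r)]
print([r["id"] for r in high_quality])
```

The same predicate works against the real data via `load_dataset("chargoddard/Open-Platypus-Chat-Judged").filter(keep_high_quality)` (requires downloading the dataset).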
<sub>Or do. I'm a dataset card, not a cop.</sub> |
Jeffzera/Maedokyle | ---
license: openrail
---
|
open-llm-leaderboard/details_cstr__Spaetzle-v12-7b | ---
pretty_name: Evaluation run of cstr/Spaetzle-v12-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cstr/Spaetzle-v12-7b](https://huggingface.co/cstr/Spaetzle-v12-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cstr__Spaetzle-v12-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:04:39.564454](https://huggingface.co/datasets/open-llm-leaderboard/details_cstr__Spaetzle-v12-7b/blob/main/results_2024-03-11T19-04-39.564454.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6379989493088111,\n\
\ \"acc_stderr\": 0.032444423915436886,\n \"acc_norm\": 0.6390370386048999,\n\
\ \"acc_norm_stderr\": 0.03310336332261133,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.578442759364669,\n\
\ \"mc2_stderr\": 0.015799986383599477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268804,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892976\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.673770165305716,\n\
\ \"acc_stderr\": 0.004678743563766658,\n \"acc_norm\": 0.8615813582951604,\n\
\ \"acc_norm_stderr\": 0.0034463307489637114\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337128,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337128\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n\
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.578442759364669,\n\
\ \"mc2_stderr\": 0.015799986383599477\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625854\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6269901440485216,\n \
\ \"acc_stderr\": 0.013320876609777224\n }\n}\n```"
repo_url: https://huggingface.co/cstr/Spaetzle-v12-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-04-39.564454.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-04-39.564454.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- '**/details_harness|winogrande|5_2024-03-11T19-04-39.564454.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-04-39.564454.parquet'
- config_name: results
data_files:
- split: 2024_03_11T19_04_39.564454
path:
- results_2024-03-11T19-04-39.564454.parquet
- split: latest
path:
- results_2024-03-11T19-04-39.564454.parquet
---
# Dataset Card for Evaluation run of cstr/Spaetzle-v12-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cstr/Spaetzle-v12-7b](https://huggingface.co/cstr/Spaetzle-v12-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cstr__Spaetzle-v12-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T19:04:39.564454](https://huggingface.co/datasets/open-llm-leaderboard/details_cstr__Spaetzle-v12-7b/blob/main/results_2024-03-11T19-04-39.564454.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6379989493088111,
"acc_stderr": 0.032444423915436886,
"acc_norm": 0.6390370386048999,
"acc_norm_stderr": 0.03310336332261133,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.578442759364669,
"mc2_stderr": 0.015799986383599477
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268804,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892976
},
"harness|hellaswag|10": {
"acc": 0.673770165305716,
"acc_stderr": 0.004678743563766658,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.0034463307489637114
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337128,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337128
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.578442759364669,
"mc2_stderr": 0.015799986383599477
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625854
},
"harness|gsm8k|5": {
"acc": 0.6269901440485216,
"acc_stderr": 0.013320876609777224
}
}
```
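Because each run is stored under a split named after its zero-padded timestamp (e.g. `2024_03_11T19_04_39.564454`), lexicographic order on split names matches chronological order, so the most recent run can also be identified without relying on the `latest` alias. A minimal sketch, using hypothetical split names:

```python
# Zero-padded timestamp split names sort lexicographically in
# chronological order, so max() over them picks the newest run.
splits = ["2023_10_22T20_20_20.923100", "2024_03_11T19_04_39.564454", "latest"]
timestamped = [name for name in splits if name != "latest"]
newest = max(timestamped)
print(newest)  # -> 2024_03_11T19_04_39.564454
```

The `latest` split simply mirrors whichever timestamped split is newest, so both lookups return the same data.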
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aamui/neutral_slang_pairs | ---
license: apache-2.0
---
|
forgeml/viton_hd | ---
dataset_info:
features:
- name: cloth
dtype: image
- name: cloth_mask
dtype: image
- name: image
dtype: image
- name: pose
dtype: image
- name: agnostic
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3665304052.189
num_examples: 11647
download_size: 3395826724
dataset_size: 3665304052.189
---
# Dataset Card for "viton_hd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/openaccess-ai-collective-oo-gpt4-filtered | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1301898769.2750826
num_examples: 719045
- name: test
num_bytes: 181059428.72491744
num_examples: 100000
download_size: 846998763
dataset_size: 1482958198.0
---
# Dataset Card for "openaccess-ai-collective-oo-gpt4-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ayush2312/Therapydataset_formatted | ---
dataset_info:
features:
- name: train
dtype: string
splits:
- name: train
num_bytes: 407954044
num_examples: 99086
download_size: 205585014
dataset_size: 407954044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Therapydataset_formatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thisiskeithkwan/synthetic_brease_record | ---
license: apache-2.0
dataset_info:
features:
- name: Title
dtype: string
- name: Diagnosis
dtype: string
- name: Specialty
dtype: string
- name: Categories
dtype: string
- name: Focus
dtype: string
- name: Difficulty
dtype: string
- name: Lab Tests
dtype: string
- name: Complexity
dtype: string
- name: Case Body
dtype: string
splits:
- name: train
num_bytes: 35943
num_examples: 17
download_size: 30611
dataset_size: 35943
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mankra/endless-sky-master | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3597217
num_examples: 389
download_size: 1391669
dataset_size: 3597217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "endless-sky-master"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ArunSamespace/airdialog-llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 250850241
num_examples: 321459
- name: validation
num_bytes: 31536743
num_examples: 40363
download_size: 95467251
dataset_size: 282386984
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
license: apache-2.0
task_categories:
- text-generation
- conversational
language:
- en
size_categories:
- 100K<n<1M
--- |
open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16 | ---
pretty_name: Evaluation run of TheBloke/UltraLM-13B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/UltraLM-13B-fp16](https://huggingface.co/TheBloke/UltraLM-13B-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics\
\ on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T20:20:20.923100](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16/blob/main/results_2023-10-22T20-20-20.923100.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split\
\ for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01363255033557047,\n\
\ \"em_stderr\": 0.0011875381552413013,\n \"f1\": 0.08585046140939587,\n\
\ \"f1_stderr\": 0.0018748006407108256,\n \"acc\": 0.43269188767410677,\n\
\ \"acc_stderr\": 0.010269983173766185\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.01363255033557047,\n \"em_stderr\": 0.0011875381552413013,\n\
\ \"f1\": 0.08585046140939587,\n \"f1_stderr\": 0.0018748006407108256\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \
\ \"acc_stderr\": 0.008510982565520497\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/UltraLM-13B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T20_20_20.923100
path:
- '**/details_harness|drop|3_2023-10-22T20-20-20.923100.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T20-20-20.923100.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T20_20_20.923100
path:
- '**/details_harness|gsm8k|5_2023-10-22T20-20-20.923100.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T20-20-20.923100.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:33:28.322265.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:33:28.322265.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T20_20_20.923100
path:
- '**/details_harness|winogrande|5_2023-10-22T20-20-20.923100.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T20-20-20.923100.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_33_28.322265
path:
- results_2023-07-19T19:33:28.322265.parquet
- split: 2023_10_22T20_20_20.923100
path:
- results_2023-10-22T20-20-20.923100.parquet
- split: latest
path:
- results_2023-10-22T20-20-20.923100.parquet
---
# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/UltraLM-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/UltraLM-13B-fp16](https://huggingface.co/TheBloke/UltraLM-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T20:20:20.923100](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16/blob/main/results_2023-10-22T20-20-20.923100.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.01363255033557047,
"em_stderr": 0.0011875381552413013,
"f1": 0.08585046140939587,
"f1_stderr": 0.0018748006407108256,
"acc": 0.43269188767410677,
"acc_stderr": 0.010269983173766185
},
"harness|drop|3": {
"em": 0.01363255033557047,
"em_stderr": 0.0011875381552413013,
"f1": 0.08585046140939587,
"f1_stderr": 0.0018748006407108256
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.008510982565520497
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011875
}
}
```
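For this run, the `acc` value in the `all` block is the mean of the per-task accuracies (gsm8k and winogrande), which you can verify directly from the numbers above:

```python
# Per-task accuracies copied from the results above.
results = {
    "harness|gsm8k|5": {"acc": 0.1068991660348749},
    "harness|winogrande|5": {"acc": 0.7584846093133386},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 6))  # -> 0.432692 (matches the "all" acc above)
```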
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wjbmattingly/ushmm-testimonies | ---
license: mit
language:
- en
tags:
- history
- holocaust
- oral testimonies
pretty_name: USHMM English Oral Testimonies Dataset
---
# Dataset Card for USHMM English Oral Testimonies Dataset
## Dataset Description
- **Homepage:** https://www.ushmm.org/collections/the-museums-collections/about/oral-history
### Dataset Summary
This is a collection of approximately 1,000 English Oral Testimonies at the United States Holocaust Memorial Museum (USHMM). The oral testimonies were collected during the late-twentieth and early twenty-first centuries. These were converted from PDFs into raw text with [Tesseract](https://github.com/tesseract-ocr/tesseract). The text was post-processed with a Python script to convert it into segments of dialogue. Because this process was automated, mistakes may remain. Occasionally, headers and footers appear in the middle of the dialogue. If found, submit an issue and these can be corrected.
This dataset was created during William J.B. Mattingly's postdoc at the Smithsonian Institution's Data Science Lab which had a cross-appointment with the USHMM. This dataset is being used for text classification, named entity recognition, and span categorization.
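As a rough illustration of that post-processing step, a minimal dialogue-segmentation sketch might look like the following. This is a hypothetical reconstruction, not the actual script used: the transcript markers (`Q:`/`A:`) are an assumed convention, and the real PDFs may differ.

```python
import re

def segment_dialogue(raw_text):
    """Split OCR'd testimony text into (category, text) segments.

    Assumes speaker turns are introduced by 'Q:' (question) and
    'A:' (answer) markers -- a common transcript convention, used
    here purely for illustration.
    """
    segments = []
    # Non-greedy capture up to the next Q:/A: marker or end of text.
    for match in re.finditer(r"\b([QA]):\s*(.+?)(?=\b[QA]:|\Z)", raw_text, re.S):
        marker, text = match.groups()
        category = "question" if marker == "Q" else "answer"
        segments.append({"category": category, "text": " ".join(text.split())})
    return segments

sample = "Q: Where were you born? A: I was born in Warsaw. Q: What year?"
print(segment_dialogue(sample))
```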
### Languages
These testimonies are strictly in English, but they were given by non-native speakers. This means foreign language words and phrases may appear throughout the testimonies.
## Dataset Structure
### Data Fields
- **rg:** String, the RG number used by the USHMM to identify specific items in a collection.
- **sequence:** Integer, the unique ID for the dialogue row.
- **text:** String, the actual piece of dialogue.
- **category:** String, can be a question or an answer.
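As a sketch of how these fields might be used, the following pairs each question with the answer that immediately follows it. The rows are hypothetical examples matching the schema above (the `rg` value is illustrative only):

```python
def pair_qa(rows):
    """Pair each question row with the answer row that follows it.

    `rows` is a list of dicts with the fields described above
    (rg, sequence, text, category), already ordered by `sequence`.
    """
    pairs = []
    for prev, curr in zip(rows, rows[1:]):
        if prev["category"] == "question" and curr["category"] == "answer":
            pairs.append((prev["text"], curr["text"]))
    return pairs

# Hypothetical rows; the rg value is a placeholder, not a real RG number.
rows = [
    {"rg": "example-rg", "sequence": 0, "category": "question", "text": "Where were you born?"},
    {"rg": "example-rg", "sequence": 1, "category": "answer", "text": "In Warsaw."},
    {"rg": "example-rg", "sequence": 2, "category": "question", "text": "What year?"},
    {"rg": "example-rg", "sequence": 3, "category": "answer", "text": "1928."},
]
print(pair_qa(rows))  # -> [('Where were you born?', 'In Warsaw.'), ('What year?', '1928.')]
```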
### Data Splits
The dataset is not split into train, test, or validation sets.
## Dataset Creation
### Curation Rationale
The dataset was created to make the testimonies more accessible for various machine learning tasks. It is also the first publicly available dataset for Holocaust oral testimonies.
### Source Data
#### Initial Data Collection and Normalization
The initial data was collected from the United States Holocaust Memorial Museum's (USHMM) Oral Testimonies. These testimonies were converted from PDFs into raw text with Tesseract and then post-processed with a Python script to convert them into segments of dialogue.
#### Who are the source language producers?
The source language producers are the survivors of the Holocaust who shared their experiences during the Oral Testimonies collected by the USHMM.
### Personal and Sensitive Information
The dataset contains personal narratives and testimonies of Holocaust survivors which may include sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset provides invaluable insights into the experiences of Holocaust survivors. It can aid in historical studies, and also serve as a rich resource for Natural Language Processing tasks related to understanding dialogues, emotion, sentiment, and other semantic and syntactic features of language.
### Discussion of Biases
As the dataset is based on personal testimonies, it is subjective and can contain the personal biases of the people sharing their experiences.
### Other Known Limitations
Since the testimonies were converted from PDFs into raw text using Tesseract, there may be OCR errors. Also, as the testimonies were given by non-native English speakers, there can be instances of imprecise English and foreign language words or phrases.
## Additional Information
### Dataset Curators
The dataset was curated by [William J.B. Mattingly](https://github.com/wjbmattingly).
### Licensing Information
Forthcoming
### Citation Information
USHMM Oral Testimonies Dataset. Curated by William J.B. Mattingly.
### Contributions
If you wish to contribute, please feel free to submit an issue. |
communityai/Telugu-LLM-Labs___konkani_alpaca_yahma_cleaned_filtered | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 77992027.0
num_examples: 28910
download_size: 27163209
dataset_size: 77992027.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
llm-jp/oasst1-21k-en | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
---
# oasst1-21k-en
This repository provides an instruction tuning dataset developed by [LLM-jp](https://llm-jp.nii.ac.jp/), a collaborative project launched in Japan.
This dataset is an English subset of [oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1).
## Send Questions to
llm-jp(at)nii.ac.jp
## Model Card Authors
*The names are listed in alphabetical order.*
Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto. |
Zaratahir123/urduprusdataset | ---
license: mit
---
|
AdapterOcean/med_alpaca_standardized_cluster_56 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 171508569
num_examples: 17138
download_size: 50509476
dataset_size: 171508569
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_56"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrtoy/mobile-ui-design | ---
license: apache-2.0
dataset_info:
features:
- name: width
dtype: int64
- name: height
dtype: int64
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: float64
- name: category
sequence: string
- name: color
list:
- name: alpha
dtype: float64
- name: blue
dtype: float64
- name: green
dtype: float64
- name: red
dtype: float64
- name: radius
sequence: float64
- name: text
sequence: string
splits:
- name: train
num_bytes: 1253458059.322
num_examples: 7846
download_size: 1160884066
dataset_size: 1253458059.322
task_categories:
- object-detection
tags:
- ui
- design
- detection
size_categories:
- 1K<n<10K
---
# Dataset: Mobile UI Design Detection
## Introduction
This dataset is designed for object detection tasks with a focus on detecting elements in mobile UI designs. The targeted objects include text, images, and groups. The dataset contains images and object detection boxes, including class labels and location information.
## Dataset Content
Load the dataset and take a look at an example:
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("mrtoy/mobile-ui-design", split="train")
>>> example = ds[0]
>>> example
{'width': 375,
'height': 667,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=375x667>,
'objects': {'bbox': [[0.0, 0.0, 375.0, 667.0],
[0.0, 0.0, 375.0, 667.0],
[0.0, 0.0, 375.0, 20.0],
...
],
'category': ['text',
'rectangle',
'rectangle',
...]}}
```
The dataset has the following fields:
- image: PIL.Image.Image object containing the image.
- height: The image height.
- width: The image width.
- objects: A dictionary containing bounding box metadata for the objects in the image:
- bbox: The object’s bounding box (xmin, ymin, width, height).
- category: The object’s category; possible values include rectangle, text, group, and image.
- color: The object’s color (text color or rectangle color), or None.
- radius: The object’s corner radius (for rectangles), or None.
- text: The text content, or None.
You can visualize the bounding boxes on the image using torchvision utilities.
```python
import torch
from torchvision.ops import box_convert
from torchvision.utils import draw_bounding_boxes
from torchvision.transforms.functional import pil_to_tensor, to_pil_image
item = ds["train"][0]
# The dataset stores boxes as (xmin, ymin, width, height);
# draw_bounding_boxes expects (xmin, ymin, xmax, ymax).
boxes_xywh = torch.tensor(item['objects']['bbox'])
boxes_xyxy = box_convert(boxes_xywh, 'xywh', 'xyxy')
to_pil_image(
draw_bounding_boxes(
pil_to_tensor(item['image']),
boxes_xyxy,
labels=item['objects']['category'],
)
)
```



## Applications
This dataset can be used for various applications, such as:
- Training and evaluating object detection models for mobile UI designs.
- Identifying design patterns and trends to aid UI designers and developers in creating high-quality mobile app UIs.
- Enhancing the automation process in generating UI design templates.
- Improving image recognition and analysis in the field of mobile UI design.
|
distilled-from-one-sec-cv12/chunk_257 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1106951872
num_examples: 215696
download_size: 1129449696
dataset_size: 1106951872
---
# Dataset Card for "chunk_257"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
moylink/test20230914 | ---
license: openrail
---
|
ibivibiv/alpaca_tiny4 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 461606800
num_examples: 290901
download_size: 267037456
dataset_size: 461606800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_pszemraj__pythia-31m-simplepile-lite-2048-scratch-2e | ---
pretty_name: Evaluation run of pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e](https://huggingface.co/pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pszemraj__pythia-31m-simplepile-lite-2048-scratch-2e\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T08:23:53.788687](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-simplepile-lite-2048-scratch-2e/blob/main/results_2023-10-29T08-23-53.788687.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.013173238255033595,\n \"f1_stderr\"\
: 0.0006780799719584048,\n \"acc\": 0.2430939226519337,\n \"acc_stderr\"\
: 0.007023561458220214\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 0.013173238255033595,\n \"\
f1_stderr\": 0.0006780799719584048\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4861878453038674,\n \"acc_stderr\": 0.014047122916440427\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|arc:challenge|25_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T08_23_53.788687
path:
- '**/details_harness|drop|3_2023-10-29T08-23-53.788687.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T08-23-53.788687.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T08_23_53.788687
path:
- '**/details_harness|gsm8k|5_2023-10-29T08-23-53.788687.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T08-23-53.788687.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hellaswag|10_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T02-33-28.434713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T02-33-28.434713.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T02-33-28.434713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T08_23_53.788687
path:
- '**/details_harness|winogrande|5_2023-10-29T08-23-53.788687.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T08-23-53.788687.parquet'
- config_name: results
data_files:
- split: 2023_09_15T02_33_28.434713
path:
- results_2023-09-15T02-33-28.434713.parquet
- split: 2023_10_29T08_23_53.788687
path:
- results_2023-10-29T08-23-53.788687.parquet
- split: latest
path:
- results_2023-10-29T08-23-53.788687.parquet
---
# Dataset Card for Evaluation run of pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e](https://huggingface.co/pszemraj/pythia-31m-simplepile-lite-2048-scratch-2e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pszemraj__pythia-31m-simplepile-lite-2048-scratch-2e",
"harness_winogrande_5",
split="train")
```
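Because the run timestamps in the split names are zero-padded ("YYYY_MM_DDTHH_MM_SS.micro"), lexicographic order matches chronological order, so the run that the "latest" split aliases can also be recovered by sorting the timestamp-named splits. A minimal sketch, using the two split names from this card:

```python
# Timestamp split names are zero-padded, so lexicographic order
# is the same as chronological order.
splits = ["2023_09_15T02_33_28.434713", "2023_10_29T08_23_53.788687"]

latest = max(splits)  # the run that the "latest" split points to
print(latest)  # -> 2023_10_29T08_23_53.788687
```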
## Latest results
These are the [latest results from run 2023-10-29T08:23:53.788687](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-simplepile-lite-2048-scratch-2e/blob/main/results_2023-10-29T08-23-53.788687.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.013173238255033595,
"f1_stderr": 0.0006780799719584048,
"acc": 0.2430939226519337,
"acc_stderr": 0.007023561458220214
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.013173238255033595,
"f1_stderr": 0.0006780799719584048
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4861878453038674,
"acc_stderr": 0.014047122916440427
}
}
```
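The results dict above uses a flat `"harness|<task>|<n_shots>"` key scheme, with an `"all"` entry holding the averages. A minimal sketch (with abbreviated values copied from the JSON above) of collecting one metric per task from such a dict:

```python
# Sketch: extract a per-task metric from a results dict shaped like the
# JSON above ("harness|<task>|<n_shots>" keys mapping to metric dicts).
results = {
    "all": {"em": 0.0, "f1": 0.0132, "acc": 0.2431},
    "harness|drop|3": {"em": 0.0, "f1": 0.0132},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.4862},
}

def per_task_metric(results, metric="acc"):
    """Return {task (n-shot): value} for every task reporting `metric`."""
    out = {}
    for key, metrics in results.items():
        if key == "all" or metric not in metrics:
            continue  # skip the aggregate entry and tasks without this metric
        _, task, n_shots = key.split("|")
        out[f"{task} ({n_shots}-shot)"] = metrics[metric]
    return out

print(per_task_metric(results))
```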
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
one-sec-cv12/chunk_74 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24036396192.25
num_examples: 250254
download_size: 22100557621
dataset_size: 24036396192.25
---
# Dataset Card for "chunk_74"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
voxreality/vox_arta_lego_v2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: history
sequence:
sequence: string
splits:
- name: train
num_bytes: 51242192
num_examples: 21124
- name: test
num_bytes: 12855521
num_examples: 5281
download_size: 17570384
dataset_size: 64097713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
readerbench/ro-business-emails | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: int64
- name: data
struct:
- name: body
dtype: string
- name: annotation
struct:
- name: choices
list:
- name: name
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 920922
num_examples: 868
- name: val
num_bytes: 273464
num_examples: 289
- name: test
num_bytes: 284370
num_examples: 290
download_size: 739445
dataset_size: 1478756
---
|
open-llm-leaderboard/details_vishesht27__22-Neuro_Model | ---
pretty_name: Evaluation run of vishesht27/22-Neuro_Model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vishesht27/22-Neuro_Model](https://huggingface.co/vishesht27/22-Neuro_Model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vishesht27__22-Neuro_Model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T20:10:57.394152](https://huggingface.co/datasets/open-llm-leaderboard/details_vishesht27__22-Neuro_Model/blob/main/results_2024-01-10T20-10-57.394152.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.605571197032111,\n\
\ \"acc_stderr\": 0.03282075920315952,\n \"acc_norm\": 0.6179788321266033,\n\
\ \"acc_norm_stderr\": 0.033657408374297766,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.6022520577190992,\n\
\ \"mc2_stderr\": 0.016271569580854295\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46501706484641636,\n \"acc_stderr\": 0.01457558392201966,\n\
\ \"acc_norm\": 0.49146757679180886,\n \"acc_norm_stderr\": 0.014609263165632179\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4519020115514838,\n\
\ \"acc_stderr\": 0.004966640868083856,\n \"acc_norm\": 0.6230830511850229,\n\
\ \"acc_norm_stderr\": 0.0048362341436554305\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976054,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976054\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935427,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657574,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.01642167050633919,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.01642167050633919\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\
\ \"acc_stderr\": 0.012704030518851491,\n \"acc_norm\": 0.4491525423728814,\n\
\ \"acc_norm_stderr\": 0.012704030518851491\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717213,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717213\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.6022520577190992,\n\
\ \"mc2_stderr\": 0.016271569580854295\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.665351223362273,\n \"acc_stderr\": 0.013261823629558373\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.0029206661987887226\n }\n}\n```"
repo_url: https://huggingface.co/vishesht27/22-Neuro_Model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|arc:challenge|25_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|gsm8k|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hellaswag|10_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-10-57.394152.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T20-10-57.394152.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- '**/details_harness|winogrande|5_2024-01-10T20-10-57.394152.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T20-10-57.394152.parquet'
- config_name: results
data_files:
- split: 2024_01_10T20_10_57.394152
path:
- results_2024-01-10T20-10-57.394152.parquet
- split: latest
path:
- results_2024-01-10T20-10-57.394152.parquet
---
# Dataset Card for Evaluation run of vishesht27/22-Neuro_Model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vishesht27/22-Neuro_Model](https://huggingface.co/vishesht27/22-Neuro_Model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vishesht27__22-Neuro_Model",
"harness_winogrande_5",
split="train")
```
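Each evaluated task above maps to its own configuration, and the config names follow a predictable pattern derived from the lm-eval-harness task name and few-shot count. A small illustrative helper (not part of the `datasets` library) sketches that mapping:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name used by this card for a lm-eval-harness task.

    Non-alphanumeric separators in the task name (":" and "-") become
    underscores, and the few-shot count is appended, e.g.
    "hendrycksTest-anatomy" with 5 shots -> "harness_hendrycksTest_anatomy_5".
    """
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{num_fewshot}"


# Examples matching the configurations listed in the YAML header:
print(harness_config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(harness_config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
print(harness_config_name("winogrande", 5))             # harness_winogrande_5
```

Any of the resulting names can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.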
## Latest results
These are the [latest results from run 2024-01-10T20:10:57.394152](https://huggingface.co/datasets/open-llm-leaderboard/details_vishesht27__22-Neuro_Model/blob/main/results_2024-01-10T20-10-57.394152.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find the results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.605571197032111,
"acc_stderr": 0.03282075920315952,
"acc_norm": 0.6179788321266033,
"acc_norm_stderr": 0.033657408374297766,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.6022520577190992,
"mc2_stderr": 0.016271569580854295
},
"harness|arc:challenge|25": {
"acc": 0.46501706484641636,
"acc_stderr": 0.01457558392201966,
"acc_norm": 0.49146757679180886,
"acc_norm_stderr": 0.014609263165632179
},
"harness|hellaswag|10": {
"acc": 0.4519020115514838,
"acc_stderr": 0.004966640868083856,
"acc_norm": 0.6230830511850229,
"acc_norm_stderr": 0.0048362341436554305
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976054,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976054
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935427,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657574,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.01642167050633919,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.01642167050633919
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851491,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851491
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717213,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717213
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.6022520577190992,
"mc2_stderr": 0.016271569580854295
},
"harness|winogrande|5": {
"acc": 0.665351223362273,
"acc_stderr": 0.013261823629558373
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887226
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
longevity-genie/all_pubmed | ---
license: apache-2.0
---
|
tyzhu/squad_qa_no_id_v5_full_recite_ans_sent_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 6958901.936699858
num_examples: 4345
- name: validation
num_bytes: 402971
num_examples: 300
download_size: 1524500
dataset_size: 7361872.936699858
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_ans_sent_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/CMedQAv1-reranking | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: test
num_bytes: 31879155
num_examples: 1000
download_size: 20670061
dataset_size: 31879155
---
# Dataset Card for "CMedQAv1-reranking"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pykeio/oshichats-v1-2308 | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
- conversational
- text-generation
- token-classification
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
tags:
- livestream
- stream
- chat
- messages
- vtuber
- vtubers
pretty_name: OSHIChats v1
size_categories:
- 1M<n<10M
---
## OSHIChats v1 (August 2023)
OSHIChats v1 is a dataset of 8.06 million high-quality filtered English chat messages collected from various [VTuber](https://en.wikipedia.org/wiki/VTuber) live streams.
Compared to our previous dataset, [pykeio/vtuber-chats-2023-filtered-en-8.7M](https://huggingface.co/datasets/pykeio/vtuber-chats-2023-filtered-en-8.7M), we make the following improvements:
- Include stream topic information
- Far more accurate nickname detection using NLP
- Previously we did not match names like "dad" (nickname for Mori Calliope) or "mom" (nickname for Nina Kosaka) because they were too general. Now, we analyze the context and other information about the stream to determine whether to match such nicknames.
- Detect and normalize fan names like takodachi or pentomo
## Usage
Once you gain access to the dataset, you'll also need to log in through the Hugging Face CLI with `huggingface-cli login`.
```py
from datasets import load_dataset
chats_dataset = load_dataset('pykeio/oshichats-v1-2308', split='train', revision='refs/convert/parquet')
chats_dataset[0]
# {'liver': 'FgXWZOUZA2oYHNr6qDmsTQ', 'stream': {'id': 'JHBv4BA_Y84', 'topic': 'Twisted_Wonderland'}, 'is_super': False, 'message': "i think i've grown to dislike them ", 'author': 'chxrry_head', 'time': [1660106235135797, 2126652]}
```
## Samples
```json
{
"liver": "kieJGn3pgJikVW8gmMXE2w",
"stream": {
"id": "dMUhbAcI5gk",
"topic": "minecraft"
},
"is_super": false,
"message": "yay <|liver:bW9t|> is streaming while I'm awake!",
"author": "Redribbon Vicky",
"time": [1651976493761550, 44936]
}
{
"liver": "yl1z3jo3XHR1riLFKG5UAg",
"stream": {
"id": "TgEX7HFqTYc",
"topic": "Donkey_Kong"
},
"is_super": false,
"message": "Stop running <|liver:QW1l|><|:ameHeh:|><|:ameHeh:|><|:ameHeh:|>",
"author": "Anon",
"time": [1616291612238864, 889273]
}
```
## Data fields
- `liver`: ID of the YouTube channel hosting the stream which the chat message came from.
- `stream`: Information about the stream.
- `id`: Video ID of the YouTube stream.
- `topic`: Topic of the stream (or `null` if a topic could not be determined). This can be things like `talk`, `Minecraft`, `Singing`, `GTA`, `Asmr`, etc.
- `is_super`: Whether or not the message is a Superchat (donation).
- `message`: Contents of the message. For consistency and ease of use on downstream tasks, we replace certain words with easily matchable special tokens:
* `<|liver:{b64}|>`: The substring refers to the host of the stream.
* `<|liver-fans:{b64}|>`: The substring refers to a nickname given to the fanbase of the host of the stream, e.g. aloupeeps or takodachis.
* `<|known-collaborator:{channelID}:{b64}|>`: The substring refers to a fellow VTuber that is present in the stream.
* `<|maybe-collaborator:{channelID}:{b64}|>`: The substring refers to a fellow VTuber that may or may not be part of the stream.
* `<|collaborator-fans:{channelID}:{b64}|>`: The substring refers to the fanbase of a collaborator present in the stream.
* `<|:{emote}:|>`: Represents a channel emote.
* Note that `channelID` is a YouTube channel ID, and `b64` is the original substring encoded as base64.
- `author`: The username of the author.
- `time`: A tuple containing the Unix timestamp of when the message was sent, and the relative time since the start of the stream.
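The token grammar above can be inverted: each liver/collaborator placeholder ends with the original substring encoded as base64, so a small helper can restore the raw message text. Below is a minimal sketch; the regex and the `expand_tokens` helper are illustrative, not part of the dataset or its tooling:
```python
import base64
import re

# Matches the liver/collaborator tokens described above; the final segment
# is the original substring encoded as base64. Emote tokens (<|:name:|>)
# carry no base64 payload and are left untouched.
TOKEN_RE = re.compile(
    r"<\|(?:known-collaborator|maybe-collaborator|collaborator-fans"
    r"|liver-fans|liver)(?::[^:|]+)?:([A-Za-z0-9+/=]+)\|>"
)

def expand_tokens(message: str) -> str:
    """Replace each special token with its decoded original substring."""
    return TOKEN_RE.sub(
        lambda m: base64.b64decode(m.group(1)).decode("utf-8"), message
    )

expand_tokens("yay <|liver:bW9t|> is streaming while I'm awake!")
# -> "yay mom is streaming while I'm awake!"
```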
## License
Licensed under [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/); you must give attribution, you may not use the dataset for commercial purposes, and you must distribute any transformations or copies of the dataset under the same license. [Contact us](mailto:contact@pyke.io) for alternative/commercial licensing. |
tanningpku/lichess | ---
license: apache-2.0
---
|
Darkme/SakamataChloe | ---
license: other
---
|
autoevaluate/autoeval-staging-eval-project-emotion-8f618256-13785902 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: Ahmed007/distilbert-base-uncased-finetuned-emotion
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: Ahmed007/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ahmetgunduz](https://huggingface.co/ahmetgunduz) for evaluating this model. |
giux78/20000-50000-ultrafeedback-binarized-preferences-cleaned-ita | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
splits:
- name: train
num_bytes: 197228907
num_examples: 30000
download_size: 87134816
dataset_size: 197228907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "20000-50000-ultrafeedback-binarized-preferences-cleaned-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lshowway/wikipedia.reorder.SVO | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4083836556
num_examples: 1986076
download_size: 1989232973
dataset_size: 4083836556
---
# Dataset Card for "wikipedia.reorder.SVO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deep-learning-analytics/arxiv_small_nougat | ---
dataset:
name: arxiv_small_nougat
description: A dataset containing 108 recent papers from arXiv related to Large Language Models (LLMs) and Transformers, parsed and processed using Meta's Nougat model to preserve tables and math equations.
license: [MIT]
task_categories: [Natural Language Processing, Machine Learning]
languages: [English]
size: 108 papers
download_size: [21.9MB]
---
## Dataset Description
The "arxiv_small_nougat" dataset is a collection of 108 recent papers sourced from arXiv, focusing on topics related to Large Language Models (LLMs) and Transformers. These papers have been meticulously processed and parsed using Meta's Nougat model, which is specifically designed to retain the integrity of complex elements such as tables and mathematical equations.
## Data Format
The dataset contains the parsed content of the selected papers, with special attention given to the preservation of formatting, tables, and mathematical expressions. Each paper is provided as plain text.
## Usage
Researchers, academics, and natural language processing practitioners can leverage this dataset for various tasks related to LLMs and Transformers, including:
- Language modeling
- Text summarization
- Information retrieval
- Table and equation extraction
## Acknowledgments
We acknowledge the arXiv platform for providing open access to a wealth of research papers in the field of machine learning and natural language processing.
## License
MIT
---
|
facebook/winoground | ---
pretty_name: Winoground
task_categories:
- image-to-text
- text-to-image
- image-classification
extra_gated_prompt: >-
By clicking on “Access repository” below, you also agree that you are using it
solely for research purposes. The full license agreement is available in the
dataset files.
language:
- en
---
# Dataset Card for Winoground
## Dataset Description
Winoground is a novel task and dataset for evaluating the ability of vision and language models to conduct visio-linguistic compositional reasoning. Given two images and two captions, the goal is to match them correctly—but crucially, both captions contain a completely identical set of words/morphemes, only in a different order. The dataset was carefully hand-curated by expert annotators and is labeled with a rich set of fine-grained tags to assist in analyzing model performance. In our accompanying paper, we probe a diverse range of state-of-the-art vision and language models and find that, surprisingly, none of them do much better than chance. Evidently, these models are not as skilled at visio-linguistic compositional reasoning as we might have hoped. In the paper, we perform an extensive analysis to obtain insights into how future work might try to mitigate these models’ shortcomings. We aim for Winoground to serve as a useful evaluation set for advancing the state of the art and driving further progress in the field.
We are thankful to Getty Images for providing the image data.
## Data
The captions and tags are located in `data/examples.jsonl` and the images are located in `data/images.zip`. You can load the data as follows:
```python
from datasets import load_dataset
examples = load_dataset('facebook/winoground', use_auth_token=<YOUR USER ACCESS TOKEN>)
```
You can get `<YOUR USER ACCESS TOKEN>` by following these steps:
1) log into your Hugging Face account
2) click on your profile picture
3) click "Settings"
4) click "Access Tokens"
5) generate an access token
## Model Predictions and Statistics
The image-caption model scores from our paper are saved in `statistics/model_scores`. To compute many of the tables and graphs from our paper, run the following commands:
```bash
git clone https://huggingface.co/datasets/facebook/winoground
cd winoground
pip install -r statistics/requirements.txt
python statistics/compute_statistics.py
```
## FLAVA Colab notebook code for Winoground evaluation
https://colab.research.google.com/drive/1c3l4r4cEA5oXfq9uXhrJibddwRkcBxzP?usp=sharing
## CLIP Colab notebook code for Winoground evaluation
https://colab.research.google.com/drive/15wwOSte2CjTazdnCWYUm2VPlFbk2NGc0?usp=sharing
## Paper FAQ
### Why is the group score for a random model equal to 16.67%?
<details>
<summary>Click for a proof!</summary>
Intuitively, we might think that we can multiply the probabilities from the image and text score to get 1/16 = 6.25%. But, these scores are not conditionally independent. We can find the correct probability with combinatorics:
For ease of notation, let:
- a = s(c_0, i_0)
- b = s(c_1, i_0)
- c = s(c_1, i_1)
- d = s(c_0, i_1)
The group score is defined as 1 if a > b, a > d, c > b, c > d and 0 otherwise.
As one would say to GPT-3, let's think step by step:
1. There are 4! = 24 different orderings of a, c, b, d.
2. There are only 4 orderings for which a > b, a > d, c > b, c > d:
- a, c, b, d
- a, c, d, b
- c, a, b, d
- c, a, d, b
3. No ordering is any more likely than another because a, b, c, d are sampled from the same random distribution.
4. We can conclude that the probability of a group score of 1 is 4/24 = 0.166...
</details>
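The counting argument in the proof can also be sanity-checked by brute force over all 24 orderings (an illustrative check, not part of the official statistics scripts):
```python
from itertools import permutations

# Assign 4 distinct scores to a = s(c_0, i_0), b = s(c_1, i_0),
# c = s(c_1, i_1), d = s(c_0, i_1) in every possible order, and count
# the orderings that yield a group score of 1.
wins = sum(
    a > b and a > d and c > b and c > d
    for a, b, c, d in permutations(range(4))
)
print(wins, "/", 24)  # 4 / 24 = 0.166...
```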
## Citation Information
[https://arxiv.org/abs/2204.03162](https://arxiv.org/abs/2204.03162)
Tristan Thrush and Candace Ross contributed equally.
```bibtex
@inproceedings{thrush_and_ross2022winoground,
author = {Tristan Thrush and Ryan Jiang and Max Bartolo and Amanpreet Singh and Adina Williams and Douwe Kiela and Candace Ross},
title = {Winoground: Probing vision and language models for visio-linguistic compositionality},
booktitle = {CVPR},
year = 2022,
}
``` |
open-llm-leaderboard/details_WizardLM__WizardMath-13B-V1.0 | ---
pretty_name: Evaluation run of WizardLM/WizardMath-13B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardMath-13B-V1.0](https://huggingface.co/WizardLM/WizardMath-13B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardMath-13B-V1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T22:45:52.861079](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardMath-13B-V1.0/blob/main/results_2023-10-12T22-45-52.861079.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893313,\n \"f1\": 0.07075817953020154,\n\
\ \"f1_stderr\": 0.0015254513833319102,\n \"acc\": 0.4212998893591507,\n\
\ \"acc_stderr\": 0.010848795701326375\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893313,\n\
\ \"f1\": 0.07075817953020154,\n \"f1_stderr\": 0.0015254513833319102\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \
\ \"acc_stderr\": 0.009065050306776925\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875825\n\
\ }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardMath-13B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T22_45_52.861079
path:
- '**/details_harness|drop|3_2023-10-12T22-45-52.861079.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T22-45-52.861079.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T22_45_52.861079
path:
- '**/details_harness|gsm8k|5_2023-10-12T22-45-52.861079.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T22-45-52.861079.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T22_45_52.861079
path:
- '**/details_harness|winogrande|5_2023-10-12T22-45-52.861079.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T22-45-52.861079.parquet'
- config_name: results
data_files:
- split: 2023_10_12T22_45_52.861079
path:
- results_2023-10-12T22-45-52.861079.parquet
- split: latest
path:
- results_2023-10-12T22-45-52.861079.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardMath-13B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardMath-13B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardMath-13B-V1.0](https://huggingface.co/WizardLM/WizardMath-13B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardMath-13B-V1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-12T22:45:52.861079](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardMath-13B-V1.0/blob/main/results_2023-10-12T22-45-52.861079.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893313,
"f1": 0.07075817953020154,
"f1_stderr": 0.0015254513833319102,
"acc": 0.4212998893591507,
"acc_stderr": 0.010848795701326375
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893313,
"f1": 0.07075817953020154,
"f1_stderr": 0.0015254513833319102
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776925
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875825
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bellagio-ai/t2i-one-pillar-pagoda | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 10724002.0
num_examples: 27
download_size: 10667654
dataset_size: 10724002.0
---
# Dataset Card for "t2i-one-pillar-pagoda"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GamblerYu/eth_tx_cls_mini | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_stsb_definite_abstract | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 24083
num_examples: 117
- name: test
num_bytes: 13938
num_examples: 72
- name: train
num_bytes: 97504
num_examples: 482
download_size: 98295
dataset_size: 135525
---
# Dataset Card for "MULTI_VALUE_stsb_definite_abstract"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
another-symato/culturax-subset | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 30156352228
num_examples: 6400728
download_size: 16013823147
dataset_size: 30156352228
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Meghdad-DTU/Resume_classification | ---
dataset_info:
features:
- name: Resume_str
dtype: string
- name: Category
dtype: string
splits:
- name: train
num_bytes: 8610060
num_examples: 1738
- name: validation
num_bytes: 1219596
num_examples: 249
- name: test
num_bytes: 2537724
num_examples: 497
download_size: 5826977
dataset_size: 12367380
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CyberHarem/matsuo_chizuru_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsuo_chizuru/松尾千鶴 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsuo_chizuru/松尾千鶴 (THE iDOLM@STER: Cinderella Girls), containing 121 images and their tags.
The core tags of this character are `short_hair, black_hair, hair_ornament, hairclip, black_eyes, thick_eyebrows, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 121 | 96.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuo_chizuru_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 121 | 69.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuo_chizuru_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 272 | 141.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuo_chizuru_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 121 | 91.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuo_chizuru_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 272 | 174.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuo_chizuru_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matsuo_chizuru_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, blush, dress, open_mouth, smile, bare_shoulders, hair_bow, looking_at_viewer, white_background, choker, ribbon, simple_background, collarbone, detached_sleeves, jewelry, upper_body |
| 1 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, upper_body, long_sleeves, smile, heart, bracelet, necklace, white_shirt |
| 2 | 17 |  |  |  |  |  | 1girl, blazer, blue_jacket, white_shirt, looking_at_viewer, red_necktie, school_uniform, collared_shirt, solo, long_sleeves, simple_background, upper_body, white_background, blush, open_mouth, skirt, swept_bangs |
| 3 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, maid_headdress, solo, black_ribbon, enmaided, simple_background, waist_apron, white_apron, breasts, detached_collar, frills, open_mouth, puffy_short_sleeves, wrist_cuffs, black_skirt, grey_eyes, maid_apron, smile, white_background |
| 4 | 6 |  |  |  |  |  | kimono, looking_at_viewer, smile, 1girl, floral_print, solo, blush, calligraphy_brush, hakama_skirt, tasuki, bangs, barefoot, holding, ink |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | dress | open_mouth | smile | bare_shoulders | hair_bow | looking_at_viewer | white_background | choker | ribbon | simple_background | collarbone | detached_sleeves | jewelry | upper_body | long_sleeves | heart | bracelet | necklace | white_shirt | blazer | blue_jacket | red_necktie | school_uniform | collared_shirt | skirt | swept_bangs | maid_headdress | black_ribbon | enmaided | waist_apron | white_apron | breasts | detached_collar | frills | puffy_short_sleeves | wrist_cuffs | black_skirt | grey_eyes | maid_apron | kimono | floral_print | calligraphy_brush | hakama_skirt | tasuki | bangs | barefoot | holding | ink |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:-------------|:--------|:-----------------|:-----------|:--------------------|:-------------------|:---------|:---------|:--------------------|:-------------|:-------------------|:----------|:-------------|:---------------|:--------|:-----------|:-----------|:--------------|:---------|:--------------|:--------------|:-----------------|:-----------------|:--------|:--------------|:-----------------|:---------------|:-----------|:--------------|:--------------|:----------|:------------------|:---------|:----------------------|:--------------|:--------------|:------------|:-------------|:---------|:---------------|:--------------------|:---------------|:---------|:--------|:-----------|:----------|:------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | | X | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | X | X | | X | | | | X | X | | | X | | | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | X | X | | | X | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
tigerbhai/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/qin_liangyu_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of qin_liangyu/秦良玉/秦良玉 (Fate/Grand Order)
This is the dataset of qin_liangyu/秦良玉/秦良玉 (Fate/Grand Order), containing 387 images and their tags.
The core tags of this character are `green_eyes, hair_bun, double_bun, black_hair, breasts, sidelocks, green_ribbon, ribbon, large_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 387 | 523.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qin_liangyu_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 387 | 454.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qin_liangyu_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1004 | 866.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qin_liangyu_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/qin_liangyu_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bun_cover, chinese_clothes, looking_at_viewer, solo, white_bodysuit, black_gloves, simple_background, blush, fingerless_gloves, white_background, covered_navel, arm_guards, hair_between_eyes, skin_tight, hair_ribbon, open_mouth, thighs |
| 1 | 8 |  |  |  |  |  | 1girl, black_gloves, bun_cover, chinese_clothes, closed_mouth, fingerless_gloves, solo, spear, white_cape, arm_guards, covered_navel, elbow_gloves, holding_weapon, looking_at_viewer, white_bodysuit, hair_between_eyes, skin_tight, smile, cloud_print, standing, thighs, blush, hair_ribbon |
| 2 | 5 |  |  |  |  |  | 1girl, arm_guards, black_gloves, bun_cover, chinese_clothes, covered_navel, fingerless_gloves, looking_at_viewer, skin_tight, solo, spear, thighs, white_cape, covered_nipples, hair_between_eyes, holding_weapon, cloud_print, open_mouth, petals, white_bodysuit |
| 3 | 5 |  |  |  |  |  | 1girl, arm_guards, black_gloves, bun_cover, chinese_clothes, covered_navel, fingerless_gloves, holding_weapon, looking_at_viewer, open_mouth, solo, spear, white_cape, fighting_stance, hair_between_eyes, skin_tight, teeth, white_bodysuit, thighs, blush, simple_background, white_background |
| 4 | 6 |  |  |  |  |  | 1girl, ass, black_gloves, bun_cover, chinese_clothes, elbow_gloves, fingerless_gloves, from_behind, holding_weapon, looking_at_viewer, looking_back, skin_tight, solo, spear, thighs, arm_guards, simple_background, white_background, white_bodysuit, standing, smile |
| 5 | 6 |  |  |  |  |  | 1girl, chinese_clothes, closed_mouth, looking_at_viewer, smile, solo, upper_body, bun_cover, hair_between_eyes, blush, bodysuit, simple_background, white_background, white_cape |
| 6 | 26 |  |  |  |  |  | 1girl, bun_cover, cleavage, green_bikini, bare_shoulders, looking_at_viewer, thighs, solo, navel, blush, open_jacket, white_jacket, long_sleeves, off_shoulder, short_shorts, black_shorts, hair_ribbon, open_mouth, short_hair, belt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bun_cover | chinese_clothes | looking_at_viewer | solo | white_bodysuit | black_gloves | simple_background | blush | fingerless_gloves | white_background | covered_navel | arm_guards | hair_between_eyes | skin_tight | hair_ribbon | open_mouth | thighs | closed_mouth | spear | white_cape | elbow_gloves | holding_weapon | smile | cloud_print | standing | covered_nipples | petals | fighting_stance | teeth | ass | from_behind | looking_back | upper_body | bodysuit | cleavage | green_bikini | bare_shoulders | navel | open_jacket | white_jacket | long_sleeves | off_shoulder | short_shorts | black_shorts | short_hair | belt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:------------------|:--------------------|:-------|:-----------------|:---------------|:--------------------|:--------|:--------------------|:-------------------|:----------------|:-------------|:--------------------|:-------------|:--------------|:-------------|:---------|:---------------|:--------|:-------------|:---------------|:-----------------|:--------|:--------------|:-----------|:------------------|:---------|:------------------|:--------|:------|:--------------|:---------------|:-------------|:-----------|:-----------|:---------------|:-----------------|:--------|:--------------|:---------------|:---------------|:---------------|:---------------|:---------------|:-------------|:-------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | X | X | X | | X | X | | X | X | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | | X | X | | X | | | | | | X | X | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | X | | X | | | X | | X | | X | X | X | | X | | | | | X | X | X | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | X | | X | | | X | | | | | X | | X | | | X | | | | | | | | | | X | X | | | | | | | | | | | | |
| 6 | 26 |  |  |  |  |  | X | X | | X | X | | | | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
tansgken79/ken_02 | ---
license: apache-2.0
---
|
Infi-MM/InfiMM-Eval | ---
license: cc-by-nc-4.0
---
|
idiotgrape/safsfsad | ---
license: openrail
---
|
guyhadad01/manipulations | ---
dataset_info:
features:
- name: Column1
dtype: float64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 27574
num_examples: 247
- name: test
num_bytes: 7991
num_examples: 62
download_size: 23631
dataset_size: 35565
---
# Dataset Card for "manipulations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Aeala__VicUnlocked-alpaca-30b | ---
pretty_name: Evaluation run of Aeala/VicUnlocked-alpaca-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aeala/VicUnlocked-alpaca-30b](https://huggingface.co/Aeala/VicUnlocked-alpaca-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__VicUnlocked-alpaca-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T18:02:20.593503](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__VicUnlocked-alpaca-30b/blob/main/results_2023-10-17T18-02-20.593503.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011849832214765101,\n\
\ \"em_stderr\": 0.0011081721365098474,\n \"f1\": 0.07360528523489944,\n\
\ \"f1_stderr\": 0.0016918412800750494,\n \"acc\": 0.4642427803704344,\n\
\ \"acc_stderr\": 0.010668138318862291\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.011849832214765101,\n \"em_stderr\": 0.0011081721365098474,\n\
\ \"f1\": 0.07360528523489944,\n \"f1_stderr\": 0.0016918412800750494\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1463229719484458,\n \
\ \"acc_stderr\": 0.00973521055778526\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aeala/VicUnlocked-alpaca-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T18_02_20.593503
path:
- '**/details_harness|drop|3_2023-10-17T18-02-20.593503.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T18-02-20.593503.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T18_02_20.593503
path:
- '**/details_harness|gsm8k|5_2023-10-17T18-02-20.593503.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T18-02-20.593503.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T18_02_20.593503
path:
- '**/details_harness|winogrande|5_2023-10-17T18-02-20.593503.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T18-02-20.593503.parquet'
- config_name: results
data_files:
- split: 2023_10_17T18_02_20.593503
path:
- results_2023-10-17T18-02-20.593503.parquet
- split: latest
path:
- results_2023-10-17T18-02-20.593503.parquet
---
# Dataset Card for Evaluation run of Aeala/VicUnlocked-alpaca-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aeala/VicUnlocked-alpaca-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aeala/VicUnlocked-alpaca-30b](https://huggingface.co/Aeala/VicUnlocked-alpaca-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aeala__VicUnlocked-alpaca-30b",
"harness_winogrande_5",
split="train")
```
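The config names above follow a simple convention: the internal task id `harness|<task>|<n_shots>` maps to the config name `harness_<task>_<n_shots>`. A small illustrative helper (our own sketch, not part of any library):

```python
def harness_config_name(task_id: str) -> str:
    """Map an internal task id like 'harness|winogrande|5' to the
    dataset config name 'harness_winogrande_5'."""
    return task_id.replace('|', '_')

print(harness_config_name('harness|winogrande|5'))  # harness_winogrande_5
```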
## Latest results
These are the [latest results from run 2023-10-17T18:02:20.593503](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__VicUnlocked-alpaca-30b/blob/main/results_2023-10-17T18-02-20.593503.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.011849832214765101,
"em_stderr": 0.0011081721365098474,
"f1": 0.07360528523489944,
"f1_stderr": 0.0016918412800750494,
"acc": 0.4642427803704344,
"acc_stderr": 0.010668138318862291
},
"harness|drop|3": {
"em": 0.011849832214765101,
"em_stderr": 0.0011081721365098474,
"f1": 0.07360528523489944,
"f1_stderr": 0.0016918412800750494
},
"harness|gsm8k|5": {
"acc": 0.1463229719484458,
"acc_stderr": 0.00973521055778526
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
}
}
```
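The top-level `all` block appears to be an unweighted mean of the per-task metrics; a quick sanity check against the accuracies above:

```python
# Per-task accuracies copied from the results above.
gsm8k_acc = 0.1463229719484458
winogrande_acc = 0.7821625887924231

# Their unweighted mean reproduces the reported "all" accuracy.
mean_acc = (gsm8k_acc + winogrande_acc) / 2
print(mean_acc)  # ~0.46424278, matching "acc": 0.4642427803704344
```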
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Cripes/AHG18 | ---
license: mit
---
|
Team-PIXEL/PIXELSum_zh_wiki_for_TA | ---
license: apache-2.0
dataset_info:
features:
- name: text
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: target
dtype: string
- name: num_text_patches
dtype: int64
splits:
- name: train
num_bytes: 103154872722
num_examples: 2555904
download_size: 102774842417
dataset_size: 103154872722
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
james-burton/OrientalMuseum_min4-3Dwhite-name | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Aegis
'1': Ajaeng Holder
'2': Album Painting
'3': Amulet Mould
'4': Animal Figurine
'5': Animal Mummy
'6': Animal bone
'7': Arm Guard
'8': Axe Head
'9': Axle-caps
'10': Ball
'11': Ballista Bolt
'12': Band
'13': Basin
'14': Baton
'15': Belt Hook
'16': Betel Nut Cutter
'17': Blouse
'18': Blu-ray disc
'19': Bolt
'20': Book Cover
'21': Box
'22': Brush Pot
'23': Brush Rest
'24': Brush Tray
'25': Bulb Bowl
'26': Bullet Mould
'27': Burnisher
'28': Cabinet
'29': Cannon
'30': Cap
'31': Carved stone
'32': Case
'33': Cash Box
'34': Chest
'35': Cigar Holder
'36': Clapper
'37': Clay pipe (smoking)
'38': Comb
'39': Compass
'40': Cosmetic and Medical Equipment and Implements
'41': Cricket pot
'42': Cross-bow Lock
'43': Cup And Saucer
'44': Cup, Saucer
'45': Cushion Cover
'46': DVDs
'47': Dagger
'48': Dice Box
'49': Dice Shaker
'50': Disc
'51': Domestic Equipment and Utensils
'52': Double Dagger
'53': Dummy
'54': Ear Protector
'55': Ear Stud
'56': Earring
'57': Elephant Goad
'58': Erotic Figurine
'59': Eye Protector
'60': Ferrous object
'61': Figurine Mould
'62': Finger Ring
'63': Fitting
'64': Funerary Cone
'65': Funerary goods
'66': Funerary money
'67': Furosode
'68': Greek crosses
'69': Hand Jade
'70': Hand Protector
'71': Handwarmer
'72': Hanging
'73': Headband
'74': Heart Scarab
'75': Human Figurine
'76': Incense Holder
'77': Inkstick
'78': Kite
'79': Knee Protector
'80': Kohl Pot
'81': Kundika
'82': Leaflet
'83': Letter
'84': Lock
'85': Mah Jong Rack
'86': Majiang set
'87': Manuscript Page
'88': Massager
'89': Mat
'90': Mica Painting
'91': Miniature Painting
'92': Miniature Portrait
'93': Mortar
'94': Mould
'95': Mouth Jade
'96': Mouth Protector
'97': Mouth-piece
'98': Mummy Label
'99': Nail Protector
'100': Neck Guard
'101': Nose Protector
'102': Opium Pipe
'103': Opium Weight
'104': Oracle Bone
'105': Ostraka
'106': Palette
'107': Panel
'108': Part
'109': Pelmet
'110': Pencase
'111': Pendant
'112': Perfumer
'113': Phylactery
'114': Pigstick
'115': Pipe
'116': Pipe Case
'117': Pipe Holder
'118': Pith Painting
'119': Plaque
'120': Plate
'121': Poh Kam
'122': Pounder
'123': Prayer Wheel
'124': Rank Square
'125': Rubber
'126': Sake Cup
'127': Scabbard Chape
'128': Scabbard Slide
'129': Scarab Seal
'130': Scarf
'131': Score Board
'132': Screen
'133': Seal
'134': Seal Paste Pot
'135': Shaft Terminal
'136': Shield
'137': Shroud Weight
'138': Sleeve Band
'139': Sleeve Weight
'140': Slide
'141': Soles
'142': Spillikins
'143': Staff Head
'144': Stamp
'145': Stand
'146': Stand of Incense Burner
'147': Stem Bowl
'148': Stem Cup
'149': Story Cloth
'150': Strainer
'151': Sword Guard
'152': Table
'153': Table Runner
'154': Thangka
'155': Tomb Figure
'156': Tomb Model
'157': Washer
'158': Water Dropper
'159': Water Pot
'160': Wine Pot
'161': Woodblock Print
'162': Writing Desk
'163': accessories
'164': adzes
'165': alabastra
'166': albums
'167': altar components
'168': amphorae
'169': amulets
'170': anchors
'171': animation cels
'172': animation drawings
'173': anklets
'174': armbands
'175': armor
'176': armrests
'177': arrowheads
'178': arrows
'179': autograph albums
'180': axes
'181': 'axes: woodworking tools'
'182': back scratchers
'183': badges
'184': bags
'185': balances
'186': bandages
'187': bangles
'188': banners
'189': baskets
'190': beads
'191': beakers
'192': bedspreads
'193': bells
'194': belts
'195': bezels
'196': bi
'197': blades
'198': board games
'199': boats
'200': boilers
'201': booklets
'202': books
'203': bottles
'204': bowls
'205': boxes
'206': bracelets
'207': bread
'208': brick
'209': brooches
'210': brush washers
'211': brushes
'212': buckets
'213': buckles
'214': business cards
'215': buttons
'216': caddies
'217': calligraphy
'218': candelabras
'219': candleholders
'220': candlesticks
'221': canopic jars
'222': card cases
'223': card tables
'224': cards
'225': carvings
'226': cases
'227': celestial globes
'228': censers
'229': chains
'230': chairs
'231': charms
'232': charts
'233': chess sets
'234': chessmen
'235': chisels
'236': chopsticks
'237': cigarette cases
'238': cigarette holders
'239': cippi
'240': clamps
'241': claypipe
'242': cloth
'243': clothing
'244': coats
'245': coffins
'246': coins
'247': collar
'248': combs
'249': compact discs
'250': containers
'251': coverings
'252': covers
'253': cuffs
'254': cups
'255': cushions
'256': cylinder seals
'257': deels
'258': deity figurine
'259': diagrams
'260': dice
'261': dishes
'262': document containers
'263': documents
'264': dolls
'265': doors
'266': drawings
'267': dresses
'268': drums
'269': dung-chen
'270': earrings
'271': embroidery
'272': ensembles
'273': envelopes
'274': 'equipment for personal use: grooming, hygiene and health care'
'275': ewers
'276': fans
'277': fasteners
'278': 'feet: furniture components'
'279': female figurine
'280': fiddles
'281': figures
'282': figurines
'283': finials
'284': flagons
'285': flags
'286': flasks
'287': fragments
'288': furniture components
'289': gameboards
'290': gaming counters
'291': ge
'292': glassware
'293': gloves
'294': goblets
'295': gongs
'296': gowns
'297': greeting cards
'298': hair ornaments
'299': hairpins
'300': hammerstones
'301': handles
'302': handscrolls
'303': hanging scrolls
'304': harnesses
'305': hats
'306': headdresses
'307': headrests
'308': heads
'309': headscarves
'310': helmets
'311': hobs
'312': hoods
'313': hooks
'314': houses
'315': identity cards
'316': illuminated manuscripts
'317': incense burners
'318': incense sticks
'319': ink bottles
'320': inkstands
'321': inkstones
'322': inkwells
'323': inlays
'324': iron
'325': jackets
'326': jar seal
'327': jars
'328': jewelry
'329': juglets
'330': jugs
'331': kayagum
'332': keys
'333': kimonos
'334': knives
'335': kŏmun'gos
'336': ladles
'337': lamps
'338': lanterns
'339': lanyards
'340': leatherwork
'341': lids
'342': loom weights
'343': maces
'344': manuscripts
'345': maps
'346': maquettes
'347': masks
'348': medals
'349': miniatures
'350': mirrors
'351': miscellaneous
'352': models
'353': money
'354': mounts
'355': mugs
'356': mummies
'357': musical instruments
'358': nails
'359': necklaces
'360': needles
'361': netsukes
'362': nozzles
'363': obelisks
'364': obis
'365': oboes
'366': oil lamps
'367': ornaments
'368': pages
'369': paintings
'370': paper money
'371': paperweights
'372': papyrus
'373': passports
'374': pectorals
'375': pendants
'376': pestles
'377': petticoats
'378': photograph albums
'379': photographs
'380': pictures
'381': pins
'382': pipes
'383': pitchers
'384': plaques
'385': playing card boxes
'386': playing cards
'387': plinths
'388': plumb bobs
'389': plume holders
'390': poker
'391': pommels
'392': postage stamps
'393': postcards
'394': posters
'395': pots
'396': pottery
'397': prayers
'398': printing blocks
'399': printing plates
'400': prints
'401': punch bowls
'402': puppets
'403': purses
'404': puzzles
'405': pyxides
'406': quilts
'407': razors
'408': reliefs
'409': rifles
'410': rings
'411': robes
'412': roofing tile
'413': rosaries
'414': rose bowls
'415': rubbings
'416': rugs
'417': rulers
'418': sandals
'419': saris
'420': sarongs
'421': sashes
'422': sauceboats
'423': saucers
'424': saws
'425': scabbards
'426': scaraboids
'427': scarabs
'428': scepters
'429': scissors
'430': scrolls
'431': sculpture
'432': seed
'433': seppa
'434': shadow puppets
'435': shawls
'436': shears
'437': shell
'438': shelves
'439': sherds
'440': shields
'441': shoes
'442': shrines
'443': sistra
'444': situlae
'445': sketches
'446': skewers
'447': skirts
'448': snuff bottles
'449': socks
'450': spatulas
'451': spearheads
'452': spears
'453': spittoons
'454': spoons
'455': staples
'456': statues
'457': statuettes
'458': steelyards
'459': stelae
'460': sticks
'461': stirrup jars
'462': stools
'463': stoppers
'464': straps
'465': studs
'466': styluses
'467': sugar bowls
'468': swagger sticks
'469': swords
'470': tablets
'471': tacks
'472': talismans
'473': tallies
'474': tangrams
'475': tankards
'476': tea bowls
'477': tea caddies
'478': tea kettles
'479': teacups
'480': teapots
'481': telephones
'482': ties
'483': tiles
'484': toggles
'485': toilet caskets
'486': tools
'487': toys
'488': trays
'489': trophies
'490': trousers
'491': trumpets
'492': tubes
'493': tureens
'494': tweezers
'495': typewriters
'496': underwear
'497': unidentified
'498': urinals
'499': ushabti
'500': utensils
'501': vases
'502': veils
'503': vessels
'504': waistcoats
'505': wall tile
'506': watches
'507': weight
'508': weights
'509': whetstones
'510': whistles
'511': whorls
'512': wood blocks
'513': writing boards
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: validation
num_bytes: 684989413.356
num_examples: 5454
- name: test
num_bytes: 601095885.02
num_examples: 5454
- name: train
num_bytes: 5694050884.215
num_examples: 115895
download_size: 6260089858
dataset_size: 6980136182.591
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
presencesw/dataset_2000_decompese_question_3 | ---
dataset_info:
features:
- name: entities
sequence: 'null'
- name: triplets
list:
- name: question
dtype: string
- name: answer
dtype: string
- name: complex_question
dtype: string
splits:
- name: train
num_bytes: 70373
num_examples: 199
download_size: 27081
dataset_size: 70373
---
# Dataset Card for "dataset_2000_decompese_question_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
domrachev03/toxic_comments_subset | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: reference
dtype: string
- name: translation
dtype: string
- name: similarity
dtype: float64
- name: lenght_diff
dtype: float64
- name: ref_tox
dtype: float64
- name: trn_tox
dtype: float64
splits:
- name: train
num_bytes: 20449737.40323276
num_examples: 156516
- name: test
num_bytes: 2272236.596767238
num_examples: 17391
download_size: 17422773
dataset_size: 22721974.0
---
|
jeantimex/insightface-backup | ---
license: mit
---
Backup of the releases of https://github.com/deepinsight/insightface due to the following issues:
- https://github.com/deepinsight/insightface/issues/1896
- https://github.com/InstantID/InstantID/issues/60 |
open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8-1.1b | ---
pretty_name: Evaluation run of cognitivecomputations/TinyDolphin-2.8-1.1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/TinyDolphin-2.8-1.1b](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8-1.1b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8-1.1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T11:30:41.082288](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8-1.1b/blob/main/results_2024-01-23T11-30-41.082288.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2622018497674234,\n\
\ \"acc_stderr\": 0.030893654783692482,\n \"acc_norm\": 0.26309169403239707,\n\
\ \"acc_norm_stderr\": 0.03165287942154967,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023509,\n \"mc2\": 0.36506322642682476,\n\
\ \"mc2_stderr\": 0.014134362597043171\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32593856655290104,\n \"acc_stderr\": 0.01369743246669324,\n\
\ \"acc_norm\": 0.3430034129692833,\n \"acc_norm_stderr\": 0.013872423223718174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46126269667396935,\n\
\ \"acc_stderr\": 0.004974783753309698,\n \"acc_norm\": 0.5944035052778331,\n\
\ \"acc_norm_stderr\": 0.004900036261309041\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106132,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106132\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496238,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496238\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.033954900208561116,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.033954900208561116\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.02489246917246284,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.02489246917246284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148543,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148543\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888236,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888236\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479663,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479663\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.015769984840690525,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.015769984840690525\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n\
\ \"acc_stderr\": 0.02616058445014049,\n \"acc_norm\": 0.3054662379421222,\n\
\ \"acc_norm_stderr\": 0.02616058445014049\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2737940026075619,\n\
\ \"acc_stderr\": 0.01138861216797939,\n \"acc_norm\": 0.2737940026075619,\n\
\ \"acc_norm_stderr\": 0.01138861216797939\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.01815287105153881,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.01815287105153881\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.04309118709946459,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.04309118709946459\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007646,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007646\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245231,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245231\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023509,\n \"mc2\": 0.36506322642682476,\n\
\ \"mc2_stderr\": 0.014134362597043171\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6069455406471981,\n \"acc_stderr\": 0.013727276249108451\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.0033660229497263707\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/TinyDolphin-2.8-1.1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|arc:challenge|25_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|gsm8k|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hellaswag|10_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T11-30-41.082288.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T11-30-41.082288.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- '**/details_harness|winogrande|5_2024-01-23T11-30-41.082288.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T11-30-41.082288.parquet'
- config_name: results
data_files:
- split: 2024_01_23T11_30_41.082288
path:
- results_2024-01-23T11-30-41.082288.parquet
- split: latest
path:
- results_2024-01-23T11-30-41.082288.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8-1.1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/TinyDolphin-2.8-1.1b](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8-1.1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8-1.1b",
"harness_winogrande_5",
split="train")
```
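Because each timestamped split follows a fixed `%Y_%m_%dT%H_%M_%S.%f` naming pattern, you can also resolve the newest run yourself instead of relying on the `latest` alias. A minimal sketch (the second split name below is illustrative, not a real run in this repo):

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern used by this repo,
    e.g. "2024_01_23T11_30_41.082288".
    """
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    return max(split_names, key=lambda s: datetime.strptime(s, fmt))

# The first split exists in this repo; the second is a made-up earlier run.
splits = ["2024_01_23T11_30_41.082288", "2023_11_13T14_52_35.782186"]
print(newest_split(splits))  # -> 2024_01_23T11_30_41.082288
```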
## Latest results
These are the [latest results from run 2024-01-23T11:30:41.082288](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8-1.1b/blob/main/results_2024-01-23T11-30-41.082288.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2622018497674234,
"acc_stderr": 0.030893654783692482,
"acc_norm": 0.26309169403239707,
"acc_norm_stderr": 0.03165287942154967,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023509,
"mc2": 0.36506322642682476,
"mc2_stderr": 0.014134362597043171
},
"harness|arc:challenge|25": {
"acc": 0.32593856655290104,
"acc_stderr": 0.01369743246669324,
"acc_norm": 0.3430034129692833,
"acc_norm_stderr": 0.013872423223718174
},
"harness|hellaswag|10": {
"acc": 0.46126269667396935,
"acc_stderr": 0.004974783753309698,
"acc_norm": 0.5944035052778331,
"acc_norm_stderr": 0.004900036261309041
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106132,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106132
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496238,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496238
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.033954900208561116,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.033954900208561116
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.02489246917246284,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.02489246917246284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148543,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148543
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888236,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888236
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479663,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479663
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.015769984840690525,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.015769984840690525
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.02616058445014049,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.02616058445014049
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2737940026075619,
"acc_stderr": 0.01138861216797939,
"acc_norm": 0.2737940026075619,
"acc_norm_stderr": 0.01138861216797939
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.01815287105153881,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.01815287105153881
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946459,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946459
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007646,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007646
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245231,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245231
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023509,
"mc2": 0.36506322642682476,
"mc2_stderr": 0.014134362597043171
},
"harness|winogrande|5": {
"acc": 0.6069455406471981,
"acc_stderr": 0.013727276249108451
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263707
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
geraldng01/guanaco-llama2-200 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 0
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BeIR/webis-touche2020-generated-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models on zero-shot retrieval, reporting metrics such as nDCG@10 and Recall@100.
The current best performing models can be found on the [official leaderboard spreadsheet](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
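For illustration, the three files above can be written and read back with the standard library alone (file names and contents below are a minimal made-up example, not part of any BEIR dataset):

```python
import csv
import json

# Write a one-document corpus and a one-query file in the jsonl format described above.
with open("corpus.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps({"_id": "doc1", "title": "Albert Einstein",
                        "text": "Albert Einstein was a German-born..."}) + "\n")

with open("queries.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps({"_id": "q1",
                        "text": "Who developed the mass-energy equivalence formula?"}) + "\n")

# Write qrels as a tab-separated file whose first row is the header.
with open("qrels.tsv", "w", encoding="utf-8", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["query-id", "corpus-id", "score"])
    writer.writerow(["q1", "doc1", 1])

# Read everything back into dictionaries keyed by id.
with open("corpus.jsonl", encoding="utf-8") as f:
    corpus = {d["_id"]: d for d in map(json.loads, f)}
with open("queries.jsonl", encoding="utf-8") as f:
    queries = {q["_id"]: q["text"] for q in map(json.loads, f)}
qrels = {}
with open("qrels.tsv", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter="\t")
    next(reader)  # skip the header row
    for qid, did, score in reader:
        qrels.setdefault(qid, {})[did] = int(score)

print(qrels)  # {'q1': {'doc1': 1}}
```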
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
        "text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
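Given dictionaries in this shape, simple retrieval-quality checks can be sketched directly. For example, precision@1 over a hypothetical system's ranked output (the `rankings` below are invented for illustration; they are not produced by any BEIR model):

```python
def precision_at_1(rankings, qrels):
    """Fraction of queries whose top-ranked document is judged relevant."""
    hits = sum(
        1 for qid, ranked in rankings.items()
        if ranked and qrels.get(qid, {}).get(ranked[0], 0) > 0
    )
    return hits / len(rankings)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
# Hypothetical system output: documents ranked per query, best first.
rankings = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}

print(precision_at_1(rankings, qrels))  # 0.5 — q1 is a hit, q2 a miss
```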
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
devrunner09/compare_llama2_13B_gpt35 | ---
license: apache-2.0
---
|
Bin12345/HPC_Fortran_CPP | ---
license: mit
---
|
mteb-pt/mtop_domain | ---
configs:
- config_name: pt-br
data_files:
- split: train
path: train*
- split: validation
path: validation*
- split: test
path: test_translated*
language:
- pt
--- |
SinclairSchneider/deutsche_rezepte | ---
license: unknown
dataset_info:
features:
- name: url
dtype: string
- name: instructions
dtype: string
- name: ingredients
sequence: string
- name: day
dtype: int64
- name: name
dtype: string
- name: year
dtype: int64
- name: month
dtype: string
- name: weekday
dtype: string
splits:
- name: train
num_bytes: 15257905
num_examples: 12190
download_size: 5831122
dataset_size: 15257905
---
|
shidowake/philschmid_guanaco-sharegpt-style_split_3 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 3494574.0896712057
num_examples: 2258
download_size: 2043364
dataset_size: 3494574.0896712057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vargr/ig_train_dataset | ---
dataset_info:
features:
- name: sid
dtype: int64
- name: sid_profile
dtype: int64
- name: shortcode
dtype: string
- name: profile_id
dtype: int64
- name: date
dtype: string
- name: post_type
dtype: int64
- name: description
dtype: string
- name: likes
dtype: int64
- name: comments
dtype: int64
- name: username
dtype: string
- name: bio
dtype: string
- name: following
dtype: int64
- name: followers
dtype: int64
- name: num_posts
dtype: int64
- name: is_business_account
dtype: bool
- name: lang
dtype: string
- name: description_category
dtype: string
- name: description_grade
dtype: float64
- name: image_grade
dtype: float64
- name: path
dtype: string
- name: image_objects
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: image_dimensions
dtype: int64
- name: BalancingElements
dtype: float64
- name: ColorHarmony
dtype: float64
- name: ContentAesthetics
dtype: float64
- name: DoFScore
dtype: float64
- name: LightScore
dtype: float64
- name: MotionBlurScore
dtype: float64
- name: ObjectScore
dtype: float64
- name: RuleOfThirdsScore
dtype: float64
- name: VividColorScore
dtype: float64
- name: RepetitionScore
dtype: float64
- name: SymmetryScore
dtype: float64
- name: AestheticScore
dtype: float64
- name: image_shot
dtype: string
- name: image_category
dtype: string
splits:
- name: train
num_bytes: 458234990
num_examples: 605868
download_size: 283589825
dataset_size: 458234990
---
# Dataset Card for "ig_train_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/EnvironmentalSoundClassification_ESC50-HumanAndNonSpeechSounds | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 88258145.5
num_examples: 200
download_size: 72132521
dataset_size: 88258145.5
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "environmental_sound_classification_human_and_non_speech_sounds_ESC50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
darkproger/flores-uk-beams | ---
license: mit
task_categories:
- translation
language:
- uk
- en
size_categories:
- n<1K
---
This is a dataset of translation variants generated for `load_dataset("facebook/flores", "eng_Latn-ukr_Cyrl")["dev"]` using [mistralai/Mistral-7B-v0.1](https://docs.mistral.ai/self-deployment/vllm/).
Data was generated using the following script:
```python
import sys
import requests
import json
context = """[INST] They are planning to host a party next weekend. [/INST] Вони планують провести вечірку наступного вікенду.
[INST] I enjoy swimming in the ocean and feeling the salty breeze. [/INST] Мені подобається плавати в океані та відчувати солоний вітер.
[INST]"""
def prompt(input, url="http://localhost:8000/v1/completions"):
data = {
"prompt": f"{context} {input} [/INST]",
"stop": "[INST]",
"max_tokens": 512,
"temperature": 0,
#"temperature": 1.0,
#"top_p": 0.001,
#"top_k": 40,
"model": "mistralai/Mistral-7B-v0.1",
"presence_penalty": 0.1,
"use_beam_search": True,
"n": 25,
"logprobs": 1,
}
headers = {
"Content-Type": "application/json"
}
response = requests.post(url, headers=headers, data=json.dumps(data))
result = response.json()
return result
for line in sys.stdin:
text = prompt(line.strip())
print(json.dumps(text, ensure_ascii=False))
```
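Each output line of the script above is one JSON response from the server. A small sketch for parsing the beam candidates back out (this assumes the OpenAI-style completions response schema that vLLM returns, with a `choices` list of `text` fields; the sample payload below is fabricated):

```python
import json

# A fabricated one-line sample in the assumed response format.
sample = json.dumps({
    "choices": [
        {"index": 0, "text": " Вони планують провести вечірку."},
        {"index": 1, "text": " Вони збираються влаштувати вечірку."},
    ]
})

def candidates(line):
    """Return the stripped translation variants from one response line."""
    result = json.loads(line)
    return [choice["text"].strip() for choice in result["choices"]]

print(candidates(sample))
```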
Quickly run vllm locally using:
```
docker run --gpus all -p 8000:8000 -e HF_HOME=/hf -e CUDA_VISIBLE_DEVICES=0 -v ~/.cache/huggingface:/hf \
ghcr.io/mistralai/mistral-src/vllm:latest --host 0.0.0.0 --model mistralai/Mistral-7B-v0.1
``` |
CyberHarem/kongou_mitsuko_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kongou_mitsuko (To Aru Majutsu no Index)
This is the dataset of kongou_mitsuko (To Aru Majutsu no Index), containing 40 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
autoevaluate/autoeval-eval-tweet_eval-offensive-93ad2d-30713144953 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: elozano/tweet_offensive_eval
metrics: ['bertscore']
dataset_name: tweet_eval
dataset_config: offensive
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: elozano/tweet_offensive_eval
* Dataset: tweet_eval
* Config: offensive
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@fabeelaalirawther@gmail.com](https://huggingface.co/fabeelaalirawther@gmail.com) for evaluating this model. |
BangumiBase/yakinbyoutou | ---
license: mit
tags:
- art
- not-for-all-audiences
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Yakin Byoutou
This is the image base of bangumi Yakin Byoutou, we detected 28 characters, 2053 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 194 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 84 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 106 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 228 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 228 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 32 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 15 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 184 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 48 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 35 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 35 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 52 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 50 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 52 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 79 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 24 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 12 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 16 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 41 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 29 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 122 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 27 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 33 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 28 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 56 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 24 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 10 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 209 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  | |
Jaspernl/common_voice_13_0_hi_pseudo_labelled | ---
dataset_info:
config_name: nl
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 887885645.796
num_examples: 31906
- name: validation
num_bytes: 355968997.37
num_examples: 10930
- name: test
num_bytes: 402843984.568
num_examples: 10936
download_size: 1643769397
dataset_size: 1646698627.734
configs:
- config_name: nl
data_files:
- split: train
path: nl/train-*
- split: validation
path: nl/validation-*
- split: test
path: nl/test-*
---
|
cgoosen/prompt_injection_ctf_dataset_2 | ---
task_categories:
- text-classification
language:
- en
tags:
- prompt injection
- bsides
- bsides cape town
pretty_name: BSIDES Cape Town 2023 CTF prompt injection dataset.
size_categories:
- n<1K
--- |
autoevaluate/autoeval-staging-eval-project-emotion-b9c02377-9905317 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: bhadresh-savani/roberta-base-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: bhadresh-savani/roberta-base-emotion
* Dataset: emotion
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@bhadresh-savani](https://huggingface.co/bhadresh-savani) for evaluating this model. |