| datasetId | card |
|---|---|
Mrudani16/dictation_train_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 67158760.0
num_examples: 147
download_size: 31771055
dataset_size: 67158760.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Falah/anime_art_descriptions | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1515657
num_examples: 10000
download_size: 81702
dataset_size: 1515657
---
# Dataset Card for "anime_art_descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maja601/EuroCrops | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_liminerity__ultra0 | ---
pretty_name: Evaluation run of liminerity/ultra0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__ultra0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T21:00:48.786495](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-02-16T21-00-48.786495.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3398058115621321,\n\
\ \"acc_stderr\": 0.033239808655417924,\n \"acc_norm\": 0.3409105453426057,\n\
\ \"acc_norm_stderr\": 0.033965935374477625,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.41485075799478666,\n\
\ \"mc2_stderr\": 0.014670252998442896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325718,\n\
\ \"acc_norm\": 0.41467576791808874,\n \"acc_norm_stderr\": 0.014397070564409174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5078669587731528,\n\
\ \"acc_stderr\": 0.004989163747650774,\n \"acc_norm\": 0.6802429794861581,\n\
\ \"acc_norm_stderr\": 0.004654291661255925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3584905660377358,\n \"acc_stderr\": 0.02951470358398176,\n\
\ \"acc_norm\": 0.3584905660377358,\n \"acc_norm_stderr\": 0.02951470358398176\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.10784313725490197,\n \"acc_stderr\": 0.03086428212206013,\n\
\ \"acc_norm\": 0.10784313725490197,\n \"acc_norm_stderr\": 0.03086428212206013\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162466,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162466\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138621,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138621\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3741935483870968,\n \"acc_stderr\": 0.027528904299845787,\n \"\
acc_norm\": 0.3741935483870968,\n \"acc_norm_stderr\": 0.027528904299845787\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711274,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711274\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006936,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006936\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204426,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204426\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136084,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136084\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3357798165137615,\n\
\ \"acc_stderr\": 0.02024808139675293,\n \"acc_norm\": 0.3357798165137615,\n\
\ \"acc_norm_stderr\": 0.02024808139675293\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n\
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3717948717948718,\n\
\ \"acc_stderr\": 0.031660988918880785,\n \"acc_norm\": 0.3717948717948718,\n\
\ \"acc_norm_stderr\": 0.031660988918880785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.45977011494252873,\n\
\ \"acc_stderr\": 0.017821994096933535,\n \"acc_norm\": 0.45977011494252873,\n\
\ \"acc_norm_stderr\": 0.017821994096933535\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091722,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091722\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.02768418188330289,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.02768418188330289\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2757496740547588,\n\
\ \"acc_stderr\": 0.011413813609160986,\n \"acc_norm\": 0.2757496740547588,\n\
\ \"acc_norm_stderr\": 0.011413813609160986\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.33169934640522875,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.33169934640522875,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982073,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982073\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3283582089552239,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.3283582089552239,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.03771283107626544,\n\
\ \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.03771283107626544\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.41485075799478666,\n\
\ \"mc2_stderr\": 0.014670252998442896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6550907655880032,\n \"acc_stderr\": 0.01335937980503369\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \
\ \"acc_stderr\": 0.010116708586037183\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/ultra0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|arc:challenge|25_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|gsm8k|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hellaswag|10_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T21-00-48.786495.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- '**/details_harness|winogrande|5_2024-02-16T21-00-48.786495.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T21-00-48.786495.parquet'
- config_name: results
data_files:
- split: 2024_02_16T21_00_48.786495
path:
- results_2024-02-16T21-00-48.786495.parquet
- split: latest
path:
- results_2024-02-16T21-00-48.786495.parquet
---
# Dataset Card for Evaluation run of liminerity/ultra0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
"harness_winogrande_5",
split="train")
```
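Each MMLU subject has its own config following the `harness_hendrycksTest_<subject>_5` naming pattern visible in the metadata above. A small helper like the one below — a hypothetical convenience for this card, not part of the `datasets` API — can build the config name to pass as the second argument of `load_dataset`:

```python
def mmlu_config_name(subject: str, num_fewshot: int = 5) -> str:
    """Build the details config name for one MMLU subject, following the
    harness_hendrycksTest_<subject>_<n> pattern used by this dataset."""
    return f"harness_hendrycksTest_{subject}_{num_fewshot}"

# e.g. load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
#                   mmlu_config_name("college_physics"), split="latest")
print(mmlu_config_name("college_physics"))  # harness_hendrycksTest_college_physics_5
```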
## Latest results
These are the [latest results from run 2024-02-16T21:00:48.786495](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-02-16T21-00-48.786495.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3398058115621321,
"acc_stderr": 0.033239808655417924,
"acc_norm": 0.3409105453426057,
"acc_norm_stderr": 0.033965935374477625,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.41485075799478666,
"mc2_stderr": 0.014670252998442896
},
"harness|arc:challenge|25": {
"acc": 0.3856655290102389,
"acc_stderr": 0.01422425097325718,
"acc_norm": 0.41467576791808874,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.5078669587731528,
"acc_stderr": 0.004989163747650774,
"acc_norm": 0.6802429794861581,
"acc_norm_stderr": 0.004654291661255925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3584905660377358,
"acc_stderr": 0.02951470358398176,
"acc_norm": 0.3584905660377358,
"acc_norm_stderr": 0.02951470358398176
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.10784313725490197,
"acc_stderr": 0.03086428212206013,
"acc_norm": 0.10784313725490197,
"acc_norm_stderr": 0.03086428212206013
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162466,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162466
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138621,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138621
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.027528904299845787,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.027528904299845787
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006936,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006936
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204426,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204426
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136084,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136084
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.02024808139675293,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.02024808139675293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.031660988918880785,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.031660988918880785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.45977011494252873,
"acc_stderr": 0.017821994096933535,
"acc_norm": 0.45977011494252873,
"acc_norm_stderr": 0.017821994096933535
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091722,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091722
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.02768418188330289,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.02768418188330289
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2757496740547588,
"acc_stderr": 0.011413813609160986,
"acc_norm": 0.2757496740547588,
"acc_norm_stderr": 0.011413813609160986
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.33169934640522875,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.33169934640522875,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982073,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982073
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3283582089552239,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.3283582089552239,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.41485075799478666,
"mc2_stderr": 0.014670252998442896
},
"harness|winogrande|5": {
"acc": 0.6550907655880032,
"acc_stderr": 0.01335937980503369
},
"harness|gsm8k|5": {
"acc": 0.1607278241091736,
"acc_stderr": 0.010116708586037183
}
}
```
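As an illustration of how these per-task numbers can be consumed once loaded, an unweighted mean accuracy over the Hendrycks (MMLU) tasks can be recomputed locally from a results dict of the shape shown above — here using only a three-task excerpt for brevity:

```python
# Recompute a simple (unweighted) mean accuracy over the MMLU tasks from a
# results dict shaped like the one above; only a small excerpt is used here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3092105263157895},
}

mmlu_scores = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU mean accuracy over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```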
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NobodyExistsOnTheInternet/GiftedCoder | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_mrpc_quotative_like | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 35799
num_examples: 127
- name: train
num_bytes: 75445
num_examples: 275
- name: validation
num_bytes: 9644
num_examples: 34
download_size: 87965
dataset_size: 120888
---
# Dataset Card for "MULTI_VALUE_mrpc_quotative_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Intel__neural-chat-7b-v3 | ---
pretty_name: Evaluation run of Intel/neural-chat-7b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Intel/neural-chat-7b-v3](https://huggingface.co/Intel/neural-chat-7b-v3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Intel__neural-chat-7b-v3_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-14T07:40:49.387630](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3_public/blob/main/results_2023-11-14T07-40-49.387630.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6159787282968994,\n\
\ \"acc_stderr\": 0.03250548890211772,\n \"acc_norm\": 0.6270034478352691,\n\
\ \"acc_norm_stderr\": 0.03333659536654491,\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5876875540849609,\n\
\ \"mc2_stderr\": 0.015599384707939359,\n \"em\": 0.43435402684563756,\n\
\ \"em_stderr\": 0.005076143925092938,\n \"f1\": 0.5043477348993302,\n\
\ \"f1_stderr\": 0.004800980590603821\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585188,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.0137249784655373\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6532563234415455,\n\
\ \"acc_stderr\": 0.004749606196363344,\n \"acc_norm\": 0.8329018123879706,\n\
\ \"acc_norm_stderr\": 0.0037230107458783917\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.024137632429337717,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.024137632429337717\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039931,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039931\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866225,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866225\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277745,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277745\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623325,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623325\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5876875540849609,\n\
\ \"mc2_stderr\": 0.015599384707939359\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.011631268360607778\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.43435402684563756,\n \
\ \"em_stderr\": 0.005076143925092938,\n \"f1\": 0.5043477348993302,\n\
\ \"f1_stderr\": 0.004800980590603821\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.0030152942428909512\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Intel/neural-chat-7b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|arc:challenge|25_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|drop|3_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|gsm8k|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hellaswag|10_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T07-40-49.387630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T07-40-49.387630.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- '**/details_harness|winogrande|5_2023-11-14T07-40-49.387630.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-14T07-40-49.387630.parquet'
- config_name: results
data_files:
- split: 2023_11_14T07_40_49.387630
path:
- results_2023-11-14T07-40-49.387630.parquet
- split: latest
path:
- results_2023-11-14T07-40-49.387630.parquet
---
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Intel/neural-chat-7b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3](https://huggingface.co/Intel/neural-chat-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-14T07:40:49.387630](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3_public/blob/main/results_2023-11-14T07-40-49.387630.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6159787282968994,
"acc_stderr": 0.03250548890211772,
"acc_norm": 0.6270034478352691,
"acc_norm_stderr": 0.03333659536654491,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5876875540849609,
"mc2_stderr": 0.015599384707939359,
"em": 0.43435402684563756,
"em_stderr": 0.005076143925092938,
"f1": 0.5043477348993302,
"f1_stderr": 0.004800980590603821
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585188,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.0137249784655373
},
"harness|hellaswag|10": {
"acc": 0.6532563234415455,
"acc_stderr": 0.004749606196363344,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783917
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337717,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337717
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039931,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039931
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866225,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866225
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277745,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277745
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623325,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623325
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5876875540849609,
"mc2_stderr": 0.015599384707939359
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.011631268360607778
},
"harness|drop|3": {
"em": 0.43435402684563756,
"em_stderr": 0.005076143925092938,
"f1": 0.5043477348993302,
"f1_stderr": 0.004800980590603821
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909512
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shibing624/nli_zh | ---
annotations_creators:
- shibing624
language_creators:
- shibing624
language:
- zh
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<20M
source_datasets:
- https://github.com/shibing624/text2vec
- https://github.com/IceFlameWorm/NLP_Datasets/tree/master/ATEC
- http://icrc.hitsz.edu.cn/info/1037/1162.htm
- http://icrc.hitsz.edu.cn/Article/show/171.html
- https://arxiv.org/abs/1908.11828
- https://github.com/pluto-junzeng/CNSD
task_categories:
- text-classification
task_ids:
- natural-language-inference
- semantic-similarity-scoring
- text-scoring
pretty_name: Chinese Natural Language Inference (NLI_zh)
---
# Dataset Card for NLI_zh
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Chinese NLI dataset](https://github.com/shibing624/text2vec)
- **Leaderboard:** [NLI_zh leaderboard](https://github.com/shibing624/text2vec) (located on the homepage)
- **Size of downloaded dataset files:** 16 MB
- **Total amount of disk used:** 42 MB
### Dataset Summary
A collection of common Chinese semantic matching datasets, covering five tasks: [ATEC](https://github.com/IceFlameWorm/NLP_Datasets/tree/master/ATEC), [BQ](http://icrc.hitsz.edu.cn/info/1037/1162.htm), [LCQMC](http://icrc.hitsz.edu.cn/Article/show/171.html), [PAWSX](https://arxiv.org/abs/1908.11828), and [STS-B](https://github.com/pluto-junzeng/CNSD).
Data sources:
- ATEC: https://github.com/IceFlameWorm/NLP_Datasets/tree/master/ATEC
- BQ: http://icrc.hitsz.edu.cn/info/1037/1162.htm
- LCQMC: http://icrc.hitsz.edu.cn/Article/show/171.html
- PAWSX: https://arxiv.org/abs/1908.11828
- STS-B: https://github.com/pluto-junzeng/CNSD
### Supported Tasks and Leaderboards
Supported Tasks: Chinese text matching and related tasks such as text similarity computation.
Results on Chinese matching tasks currently appear rarely in top-conference papers, so I list results from my own training runs:
**Leaderboard:** [NLI_zh leaderboard](https://github.com/shibing624/text2vec)
### Languages
All five datasets consist of Simplified Chinese text.
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```
{
"sentence1": "刘诗诗杨幂谁漂亮",
"sentence2": "刘诗诗和杨幂谁漂亮",
"label": 1,
}
{
"sentence1": "汇理财怎么样",
"sentence2": "怎么样去理财",
"label": 0,
}
```
### Data Fields
The data fields are the same among all splits.
- `sentence1`: a `string` feature.
- `sentence2`: a `string` feature.
- `label`: a classification label, with possible values `1` (similar) and `0` (dissimilar).
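The field layout and label convention can be illustrated with a small, self-contained sketch; the `label_name` helper is hypothetical (for illustration only), not part of the dataset:

```python
# Hypothetical helper mapping the integer label to its meaning:
# 1 = the two sentences are semantically similar, 0 = dissimilar.
def label_name(label: int) -> str:
    return {1: "similar", 0: "dissimilar"}[label]

# A record shaped like the 'train' example shown above.
example = {
    "sentence1": "刘诗诗杨幂谁漂亮",
    "sentence2": "刘诗诗和杨幂谁漂亮",
    "label": 1,
}
print(label_name(example["label"]))  # similar
```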
### Data Splits
#### ATEC
```shell
$ wc -l ATEC/*
20000 ATEC/ATEC.test.data
62477 ATEC/ATEC.train.data
20000 ATEC/ATEC.valid.data
102477 total
```
#### BQ
```shell
$ wc -l BQ/*
10000 BQ/BQ.test.data
100000 BQ/BQ.train.data
10000 BQ/BQ.valid.data
120000 total
```
#### LCQMC
```shell
$ wc -l LCQMC/*
12500 LCQMC/LCQMC.test.data
238766 LCQMC/LCQMC.train.data
8802 LCQMC/LCQMC.valid.data
260068 total
```
#### PAWSX
```shell
$ wc -l PAWSX/*
2000 PAWSX/PAWSX.test.data
49401 PAWSX/PAWSX.train.data
2000 PAWSX/PAWSX.valid.data
53401 total
```
#### STS-B
```shell
$ wc -l STS-B/*
1361 STS-B/STS-B.test.data
5231 STS-B/STS-B.train.data
1458 STS-B/STS-B.valid.data
8050 total
```
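Taken together, the `wc -l` listings above can be restated in machine-readable form; the numbers below are copied directly from those listings, and this sketch only aggregates them:

```python
# Split sizes (line counts) per task, as listed in the shell output above.
splits = {
    "ATEC":  {"train": 62477,  "valid": 20000, "test": 20000},
    "BQ":    {"train": 100000, "valid": 10000, "test": 10000},
    "LCQMC": {"train": 238766, "valid": 8802,  "test": 12500},
    "PAWSX": {"train": 49401,  "valid": 2000,  "test": 2000},
    "STS-B": {"train": 5231,   "valid": 1458,  "test": 1361},
}

# Total number of sentence pairs across all five tasks.
total = sum(sum(counts.values()) for counts in splits.values())
print(total)  # 543996
```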
## Dataset Creation
### Curation Rationale
As these are Chinese NLI (natural language inference) datasets, they are uploaded here to Hugging Face Datasets to make them easy for everyone to use.
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
Copyright of each dataset belongs to its original authors; please respect the original licenses when using these datasets.
BQ: Jing Chen, Qingcai Chen, Xin Liu, Haijun Yang, Daohe Lu, Buzhou Tang, "The BQ Corpus: A Large-scale Domain-specific Chinese Corpus For Sentence Semantic Equivalence Identification", EMNLP 2018.
### Annotations
#### Annotation process
#### Who are the annotators?
The original dataset authors.
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
This dataset was developed as a benchmark for evaluating representational systems for text, especially including those induced by representation learning methods, in the task of predicting truth conditions in a given context.
Systems that are successful at such a task may be more successful in modeling semantic representations.
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
- Su Jianlin (苏剑林) organized the file naming
- I (shibing624) uploaded the data to Hugging Face Datasets
### Licensing Information
For academic research use.
The BQ corpus is free to the public for academic research.
### Contributions
Thanks to [@shibing624](https://github.com/shibing624) for adding this dataset.
|
hassansh/test2 | ---
configs:
- config_name: main
data_files:
- split: test
path: "0-shot/0_shot_examples_small.json"
- config_name: additional
data_files:
- split: test
path: "1-shot/1_shot_examples_small.json"
--- |
meerkat-ml/lemma-github | ---
license: apache-2.0
---
|
morteza/cogtext | ---
pretty_name: CogText PubMed Abstracts
license:
- cc-by-4.0
language:
- en
multilinguality:
- monolingual
task_categories:
- text-classification
task_ids:
- topic-classification
- semantic-similarity-classification
size_categories:
- 100K<n<1M
paperswithcode_id: linking-theories-and-methods-in-cognitive
inference: false
model-index:
- name: cogtext-pubmed
results: []
source_datasets:
- original
language_creators:
- found
- expert-generated
configs:
- config_name: abstracts (2023)
data_files: "pubmed/abstracts2023.csv.gz"
- config_name: abstracts (2021)
data_files: "pubmed/abstracts2021.csv.gz"
tags:
- Cognitive Control
- PubMed
---
# Dataset Card for CogText PubMed Abstracts
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
The **CogText** dataset is a curated collection of abstracts about cognitive tasks and constructs from PubMed.
This dataset contains the original abstracts and their corresponding embeddings.
Please visit [CogText on GitHub](https://github.com/morteza/cogtext) for details and code.
- **Homepage:** https://github.com/morteza/cogtext
- **Repository:** https://github.com/morteza/cogtext
- **Point of Contact:** [Morteza Ansarinia](mailto:ansarinia@me.com)
- **Paper:** https://arxiv.org/abs/2203.11016
### Dataset Summary
The 2021 dataset, collected in December 2021, contains 385,705 distinct scientific articles, featuring their title, abstract, relevant metadata, and embeddings.
The articles were specifically selected for their relevance to cognitive control constructs and associated tasks.
### Supported Tasks and Leaderboards
Topic Modeling, Text Embedding
### Languages
English
## Dataset Structure
### Data Instances
522,972 scientific articles, of which 385,705 are unique.
### Data Fields
The CSV files contain the following fields:
| Field | Description |
| ----- | ----------- |
| `index` | (int) Index of the article in the current dataset |
| `pmid` | (int) PubMed ID |
| `doi` | (str) Digital Object Identifier |
| `year` | (int) Year of publication (yyyy format)|
| `journal_title` | (str) Title of the journal |
| `journal_iso_abbreviation` | (str) ISO abbreviation of the journal |
| `title` | (str) Title of the article |
| `abstract` | (str) Abstract of the article |
| `category` | (enum) Category of the article, either "CognitiveTask" or "CognitiveConstruct" |
| `label` | (enum) Label of the article, which refers to the class labels in the `ontologies/efo.owl` ontology |
| `original_index` | (int) Index of the article in the full dataset (see `pubmed/abstracts.csv.gz`) |
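As a concrete illustration of this schema, a single record might look as follows (every value below is invented for demonstration; only the field names and types follow the table above):

```python
# One invented record matching the field schema above (all values fabricated).
record = {
    "index": 0,
    "pmid": 12345678,
    "doi": "10.1000/example",
    "year": 2020,
    "journal_title": "Example Journal of Cognition",
    "journal_iso_abbreviation": "Ex J Cogn",
    "title": "An example title",
    "abstract": "An example abstract about cognitive control.",
    "category": "CognitiveTask",  # enum: "CognitiveTask" or "CognitiveConstruct"
    "label": "example-task",      # class label from the ontologies/efo.owl ontology
    "original_index": 42,
}

# The category field takes exactly one of two enum values.
assert record["category"] in {"CognitiveTask", "CognitiveConstruct"}
```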
### Data Splits
| Dataset | Description |
| ------- | ----------- |
| `pubmed/abstracts.csv.gz` | Full dataset |
| `pubmed/abstracts20pct.csv.gz` | 20% of the dataset (stratified random sample by `label`) |
| `gpt3/abstracts_gp3ada.nc` | GPT-3 embeddings of the entire dataset in XArray/CDF4 format, indexed by `pmid` |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Acknowledgments
This research was supported by the Luxembourg National Research Fund (ATTRACT/2016/ID/11242114/DIGILEARN and INTER Mobility/2017-2/ID/11765868/ULALA).
### Citation Information
To cite the paper use the following entry:
```
@misc{cogtext2022,
author = {Morteza Ansarinia and
Paul Schrater and
Pedro Cardoso-Leite},
title = {Linking Theories and Methods in Cognitive Sciences via Joint Embedding of the Scientific Literature: The Example of Cognitive Control},
year = {2022},
url = {https://arxiv.org/abs/2203.11016}
}
``` |
pankajmathur/orca_mini_v1_dataset | ---
license: apache-2.0
---
An Orca-style dataset, which can be used to fine-tune base models with the following prompt format.
```
### System:
<system>
### User:
<instruction>
### Assistant:
<output>
```
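As a minimal sketch (the helper name and the exact whitespace are assumptions, not part of this dataset), the template above can be assembled in Python like this:

```python
def build_orca_prompt(system: str, instruction: str, output: str = "") -> str:
    # Assemble the prompt template shown above; the exact whitespace
    # (single newlines, no blank lines) is a guess based on this card.
    return (
        "### System:\n" + system +
        "\n### User:\n" + instruction +
        "\n### Assistant:\n" + output
    )

p = build_orca_prompt("You are a helpful assistant.", "List three primes.")
```

Leaving `output` empty yields the inference-time prompt, which ends at the `### Assistant:` header for the model to complete.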
More details coming soon. |
HydraLM/mix-gpt4-6k-camel-rlhf-fixed-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 51028880
num_examples: 47010
- name: test
num_bytes: 3058844
num_examples: 2716
download_size: 25724863
dataset_size: 54087724
---
# Dataset Card for "mix-gpt4-6k-camel-rlhf-fixed-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713141099 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 334144
num_examples: 861
download_size: 176079
dataset_size: 334144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gowthamrajgoku/cars_data | ---
license: apache-2.0
---
|
Asap7772/relabeled_alpacafarm_pythiasft_20K_preference_data_minlength | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: output
dtype: string
- name: text
dtype: string
- name: alpaca_text
dtype: string
- name: prompt
dtype: string
- name: alpaca_prompt
dtype: string
- name: y_ref
dtype: string
- name: y_1
dtype: string
- name: y_2
dtype: string
- name: y_w
dtype: string
- name: y_w_alpaca
dtype: string
- name: y_l
dtype: string
- name: y_l_alpaca
dtype: string
- name: y_w_score
dtype: float64
- name: y_l_score
dtype: float64
- name: score_diff
dtype: float64
splits:
- name: train
num_bytes: 177945579
num_examples: 19000
- name: test
num_bytes: 9378616
num_examples: 1000
download_size: 86093422
dataset_size: 187324195
---
# Dataset Card for "relabeled_alpacafarm_pythiasft_20K_preference_data_minlength"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_8 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1127931488
num_examples: 219784
download_size: 1149768976
dataset_size: 1127931488
---
# Dataset Card for "chunk_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TifinLab/kabyle_asr | ---
license: cc
dataset_info:
features:
- name: audio
dtype: audio
- name: Licence
dtype: string
- name: Text
dtype: string
splits:
- name: train
num_bytes: 913415347.98
num_examples: 32813
- name: test
num_bytes: 332425120.056
num_examples: 14064
download_size: 815358975
dataset_size: 1245840468.036
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
schuler/TinyStories4Pascal | ---
language:
- en
pretty_name: Tiny Stories Dataset Reprocessed for Pascal Developers
tags:
- pascal
- free pascal
- lazarus
- cai
- schuler
license: cdla-sharing-1.0
task_categories:
- text-generation
- text2text-generation
size_categories:
- 1M<n<10M
---
# Tiny Stories Dataset Reprocessed for Pascal Developers
license: cdla-sharing-1.0
This dataset contains synthetically generated (by GPT-3.5 and GPT-4) short stories that use only a small vocabulary, as described in: https://arxiv.org/abs/2305.07759.
Please refer to https://arxiv.org/abs/2305.07759 for the original authors and to https://huggingface.co/datasets/roneneldan/TinyStories for the original dataset.
This repository contains a reprocessed version of https://huggingface.co/datasets/roneneldan/TinyStories so that Pascal developers can use it. It is composed of two CSV files:
* A vocabulary and
* a tokenized dataset.
## Download
This dataset can be downloaded with:
```
git clone https://huggingface.co/datasets/schuler/TinyStories4Pascal
unzip TinyStories4Pascal/tinystories_tokenized_81.csv.zip
unzip TinyStories4Pascal/tinystories_vocab.csv.zip
```
## How To Use this Dataset with Free Pascal?
To be able to compile the **free pascal** source code shown below, you'll need [Lazarus](https://www.lazarus-ide.org/) and [CAI Neural API](https://github.com/joaopauloschuler/neural-api).
The model is created with (2 transformer blocks with 16 heads each):
```
csTrainingFileName = 'tinystories_tokenized_81.csv';
csVocabFileName = 'tinystories_vocab.csv';
...
FNN.AddLayer([
TNNetInput.Create(csContextLen, 1, 1),
TNNetEmbedding.Create(FVocabSize, csEmbedDim)
]);
for I := 1 to 2 do FNN.AddTransformerBlockCAI(16, csEmbedDim);
FNN.AddLayer([
TNNetFullConnectReLU.Create(csEmbedDim),
TNNetFullConnectReLU.Create(FVocabSize),
TNNetSoftMax.Create(1)
]);
```
This is the full **free pascal** source code:
```
program SimpleTransformerDic;
{$mode objfpc}{$H+}
uses {$IFDEF UNIX} {$IFDEF UseCThreads}
cthreads, {$ENDIF} {$ENDIF}
Classes,
neuralnetwork,
neuralvolume,
neuralfit,
neuralthread,
neuraldatasets,
CustApp,
Math,
sysutils;
const
csContextLen = 81;
csTrainingFileName = 'tinystories_tokenized_81.csv';
csVocabFileName = 'tinystories_vocab.csv';
csMinSampleSize = 3; // Minimum of 3 tokens.
csEmbedDim = 128;
type
TTestFitLoading = class(TCustomApplication)
protected
FDataset: array of array of integer;
FDictionary: TStringListInt;
FDatasetSize: integer;
FNN: THistoricalNets;
NFit: TNeuralDataLoadingFit;
FSampler: TNNetSamplerBase;
FMaxPredictCharPos: integer;
FVocabSize: integer;
procedure LoadDataset;
procedure DoRun; override;
public
procedure OnAfterEpoch(Sender: TObject);
procedure OnAfterStep(Sender: TObject);
procedure GetTrainingPair(Idx: integer; ThreadId: integer; pInput, pOutput: TNNetVolume);
procedure GetValidationPair(Idx: integer; ThreadId: integer; pInput, pOutput: TNNetVolume);
procedure GetTestPair(Idx: integer; ThreadId: integer; pInput, pOutput: TNNetVolume);
end;
procedure TTestFitLoading.LoadDataset;
var
Tokens: array of integer;
begin
WriteLn('Loading vocabulary: ', csVocabFileName);
FDictionary.LoadFromFile(csVocabFileName);
WriteLn('Indexing vocabulary with ', FDictionary.Count,' words.');
FDictionary.SaveCurrentPositionAndSort();
FDictionary.StringToIntegerArray('one day a', Tokens);
WriteLn('one day a: ',Tokens[0],' ',Tokens[1],' ',Tokens[2]);
WriteLn('Dic 93: ', FDictionary.IntegerToWord(93));
WriteLn('Dic 88: ', FDictionary.IntegerToWord(88));
WriteLn('Dic 40: ', FDictionary.IntegerToWord(40));
LoadIntegersInCSV(csTrainingFileName, FDataset);
FVocabSize := FDictionary.Count;
FDatasetSize := Length(FDataSet);
WriteLn('Loaded dataset with ', FDatasetSize, ' rows');
end;
procedure TTestFitLoading.DoRun;
var
W: TNNetLayer;
I: integer;
begin
FDictionary := TStringListInt.Create();
LoadDataset();
FNN := THistoricalNets.Create();
NFit := TNeuralDataLoadingFit.Create();
FMaxPredictCharPos := 81;
FSampler := TNNetSamplerTopP.Create(0.4);
FNN.AddLayer([
TNNetInput.Create(csContextLen, 1, 1),
TNNetEmbedding.Create(FVocabSize, csEmbedDim)
]);
for I := 1 to 2 do FNN.AddTransformerBlockCAI(16, csEmbedDim);
FNN.AddLayer([
TNNetFullConnectReLU.Create(csEmbedDim),
TNNetFullConnectReLU.Create(FVocabSize),
TNNetSoftMax.Create(1)
]);
DebugThreadCount();
FNN.DebugStructure;
FNN.DebugWeights();
WriteLn('Computing...');
NFit.LogEveryBatches := 100;
NFit.InitialLearningRate := 0.001;
NFit.Inertia := 0;
NFit.LearningRateDecay := 0;
NFit.L2Decay := 0;
NFit.EnableClassComparison();
NFit.EnableDefaultLoss();
NFit.AvgWeightEpochCount := 1;
NFit.OnAfterEpoch := @OnAfterEpoch;
NFit.OnAfterStep := @OnAfterStep;
NFit.FitLoading(
FNN,
{TrainingVolumesCount=}32000*3,
{ValidationVolumesCount=}32000*3 div 20,
{TestVolumesCount=}32000*3 div 20,
{batchsize=}128,
{epochs=}500,
@GetTrainingPair, @GetValidationPair, @GetTestPair
);
FNN.DebugWeights();
OnAfterEpoch(Self);
FSampler.Free;
NFit.Free;
FNN.Free;
FDictionary.Free;
Terminate;
end;
procedure TTestFitLoading.OnAfterEpoch(Sender: TObject);
begin
WriteLn(GenerateStringFromTokens(NFit.NN, FDictionary, 'one day a', nil),'.');
WriteLn(GenerateStringFromTokens(NFit.NN, FDictionary, 'lily loved to', nil),'.');
WriteLn(GenerateStringFromTokens(NFit.NN, FDictionary, 'she and he', nil),'.');
WriteLn(GenerateStringFromTokens(NFit.NN, FDictionary, 'in the park', nil),'.');
WriteLn(GenerateStringFromTokens(NFit.NN, FDictionary, 'billy', FSampler),'.');
end;
procedure TTestFitLoading.OnAfterStep(Sender: TObject);
begin
//if Random(100)=0 then OnAfterEpoch(Sender);
//NFit.ThreadNN[0].DebugWeights();
end;
procedure TTestFitLoading.GetTrainingPair(Idx: integer; ThreadId: integer;
pInput, pOutput: TNNetVolume);
var
SampleId: integer;
SampleLen: integer;
SampleCutPosition: integer;
ExpectedTokenInt: integer;
AIntegerArray: array of integer;
begin
// Make sure that expected input and output have the proper sizes.
if FNN.GetFirstLayer().Output.Size <> pInput.Size then pInput.ReSize(FNN.GetFirstLayer().Output);
if FNN.GetLastLayer().Output.Size <> pOutput.Size then pOutput.ReSize(FNN.GetLastLayer().Output);
// Get the input sample
SampleId := Random(FDatasetSize);
SampleLen := Min(Length(FDataset[SampleId]), pInput.SizeX);
SampleLen := Min(FMaxPredictCharPos, SampleLen);
SampleCutPosition := Random(SampleLen-csMinSampleSize)+csMinSampleSize-1;
// The expected token is the next character in the string
ExpectedTokenInt := FDataset[SampleId][SampleCutPosition];
// Encode the input and output volumes
AIntegerArray := Copy(FDataset[SampleId], 0, SampleCutPosition);
pInput.Fill(0);
pInput.CopyReversedNoChecksIntArr( AIntegerArray );
pOutput.SetClassForSoftMax(ExpectedTokenInt);
pOutput.Tag := ExpectedTokenInt;
end;
procedure TTestFitLoading.GetValidationPair(Idx: integer; ThreadId: integer;
pInput, pOutput: TNNetVolume);
var
SampleId: integer;
SampleLen: integer;
SampleCutPosition: integer;
ExpectedTokenInt: integer;
AIntegerArray: array of integer;
begin
// Make sure that expected input and output have the proper sizes.
if FNN.GetFirstLayer().Output.Size <> pInput.Size then pInput.ReSize(FNN.GetFirstLayer().Output);
if FNN.GetLastLayer().Output.Size <> pOutput.Size then pOutput.ReSize(FNN.GetLastLayer().Output);
// Get the input sample
SampleId := Idx;
SampleLen := Min(Length(FDataset[SampleId]), pInput.SizeX);
SampleCutPosition := (Idx mod (SampleLen-csMinSampleSize))+csMinSampleSize-1;
// The expected token is the next character in the string
ExpectedTokenInt := FDataset[SampleId][SampleCutPosition];
// Encode the input and output volumes
AIntegerArray := Copy(FDataset[SampleId], 0, SampleCutPosition);
pInput.Fill(0);
pInput.CopyReversedNoChecksIntArr( AIntegerArray );
pOutput.SetClassForSoftMax(ExpectedTokenInt);
pOutput.Tag := ExpectedTokenInt;
end;
procedure TTestFitLoading.GetTestPair(Idx: integer; ThreadId: integer;
pInput, pOutput: TNNetVolume);
begin
GetValidationPair(Idx, ThreadId, pInput, pOutput);
end;
var
Application: TTestFitLoading;
begin
Application := TTestFitLoading.Create(nil);
Application.Title:='Free Pascal Based NLP with Transformer Blocks';
Application.Run;
Application.Free;
end.
```
## How Was The Dataset Transformed From Its Original Shape?
The original dataset was transformed using the source code shown below.
A text file was created with:
```
MIN_TRAINING_SEQ_LEN = 20
prepared_ds = []
row_cnt = 0
# for ds_row in wikids:
for ds_row in wikids:
row_cnt = row_cnt + 1
new_row = ds_row['text'].strip(" '\"")
new_row = new_row.replace(' .', '.').replace(' ,', ',').replace(' !', '!').replace(' ?', '?').replace(' ;', ';').replace(' :', ':').replace(" '", "'")
new_row = new_row.replace('<unk>', '').replace(' ', ' ')
# remove non ascii characters from new_row
new_row = ''.join([i if (ord(i) < 128) and (ord(i) > 31) else '' for i in new_row])
# remove any linefeed
new_row = new_row.replace('\n', '')
new_row_len = len(new_row)
if ( new_row_len > MIN_TRAINING_SEQ_LEN ):
prepared_ds.append(new_row)
# if row_cnt > 100000: break
if row_cnt % 100000 == 0:
print(len(prepared_ds), "loaded rows.")
print("We have", len(prepared_ds), "strings in the dataset out of a total of", row_cnt,'.')
def save_dataset(dataset, filename):
with open(filename, 'w') as f:
for item in dataset:
f.write("%s\n" % item)
save_dataset(prepared_ds,'tinystories.txt')
```
A dictionary (vocabulary) was created with:
```
MIN_TRAINING_SEQ_LEN = 6
print("Load Tiny Stories")
ds_batch_size = 64
raw_train_ds = (
tf.data.TextLineDataset("tinystories.txt")
.filter(lambda x: tf.strings.length(x) > MIN_TRAINING_SEQ_LEN)
.batch(ds_batch_size)
.shuffle(buffer_size=256)
)
# Train tokenizer vocabulary
vocab = keras_nlp.tokenizers.compute_word_piece_vocabulary(
raw_train_ds,
vocabulary_size=VOCAB_SIZE,
lowercase=True,
reserved_tokens=["[PAD]", "[UNK]", "[BOS]"],
)
np_vocab = np.array(vocab)
np.save('tinystories_vocab.npy', np_vocab)
```
The `tinystories.txt` file is tokenized with:
```
def load_text_lines(file_path):
outputlines = []
with open(file_path, 'r', encoding='utf-8') as file:
for line in file:
outputlines.append(line)
return outputlines
prepared_ds = load_text_lines('tinystories.txt')
tokenizer = keras_nlp.tokenizers.WordPieceTokenizer(
vocabulary=vocab,
sequence_length=81,
lowercase=True,
)
tokenized_ds = []
chunk_size = 10000
# prepared_ds_chunks = np.array_split(prepared_ds, len(prepared_ds)//chunk_size)
chunk_count = 0
count = 0
print('Starting to tokenize.')
start = 0
end = len(prepared_ds)
for i in range(start, end, chunk_size):
x = i
chunk = (prepared_ds[x:x+chunk_size])
tokenized_chunck = tokenizer.tokenize(chunk)
for tokenized_line in tokenized_chunck:
tokenized_ds.append(tokenized_line)
chunk_count = chunk_count + len(chunk)
count = count + 1
if count % 10 == 0:
print(i,' ',chunk_count)
print('Finished to tokenize.')
tokenized_chunck = None
chunk = None
tokenized_ds = np.array(tokenized_ds, dtype=int)
# prompt: save tokenized_ds to disk
np.save('tinystories_specific_81.npy', tokenized_ds)
np.savetxt("tinystories_vocab.csv", np_vocab, delimiter=",", fmt='%s')
np.savetxt("tinystories_tokenized_81.csv", tokenized_ds, delimiter=",", fmt='%d')
``` |
CosmicSound/autotrain-data-icons | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: icons
## Dataset Description
This dataset has been automatically processed by AutoTrain for project icons.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<512x512 RGB PIL image>",
"target": 1
},
{
"image": "<256x256 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['cannabis', 'jars'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3 |
| valid | 2 |
|
irds/neumarco_zh_train | ---
pretty_name: '`neumarco/zh/train`'
viewer: false
source_datasets: ['irds/neumarco_zh']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/zh/train`
The `neumarco/zh/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/zh/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=269,919,004
- For `docs`, use [`irds/neumarco_zh`](https://huggingface.co/datasets/irds/neumarco_zh)
This dataset is used by: [`neumarco_zh_train_judged`](https://huggingface.co/datasets/irds/neumarco_zh_train_judged)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_zh_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/neumarco_zh_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/neumarco_zh_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/72f9d0dd | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1331
dataset_size: 182
---
# Dataset Card for "72f9d0dd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713119461 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 111515
num_examples: 294
download_size: 62878
dataset_size: 111515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ubaidbhat/CarAccidentLaw | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: source_doc
dtype: string
- name: groundedness_score
dtype: float64
- name: groundedness_eval
dtype: string
- name: relevance_score
dtype: float64
- name: relevance_eval
dtype: string
splits:
- name: train
num_bytes: 2962
num_examples: 2
download_size: 14877
dataset_size: 2962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_15_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29691554
num_examples: 14739
download_size: 16243581
dataset_size: 29691554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_15_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanCompatibleAI/ppo-seals-HalfCheetah-v0 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float64
splits:
- name: train
num_bytes: 89536876
num_examples: 104
download_size: 24489478
dataset_size: 89536876
---
# Dataset Card for "ppo-seals-HalfCheetah-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
conghao/llama2-jinyong-style | ---
task_categories:
- question-answering
language:
- zh
tags:
- medical
- art
size_categories:
- 1K<n<10K
---
### LLama2 Fine-tuning Medical Dataset
Open-source datasets tend to be highly similar to the data used to pretrain the model, so fine-tuning on them shows little effect; a dataset with a distinctive style is therefore needed.
To that end, we built a medical dataset in the **Jin Yong writing style** that can be used directly for fine-tuning LLama2.
Fine-tuning experiments show that when the model receives a medical question, the fine-tuned model answers in the **Jin Yong writing style**.
### Instruction
```
Instruction: 我睡眠不足,很头痛
```
### Output
```
少侠保重身体。江湖人称我为医仙,今日得闻君有头痛之症,实非善兆。睡眠不足,乃是身体之不适,需慎之又慎。头痛之症,或许源自于血液循环不畅,或许源于神经压力过大。当务之急,当调整生活习惯,保持良好的睡眠规律,避免过度劳累。此外,可尝试调整饮食,避免辛辣刺激之物,以免加重头痛之苦。如君仍遭此病痛,可寻求名医良药,以求解忧。</s>
```
As the example shows, the model has learned to answer in the **Jin Yong writing style**. |
open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo | ---
pretty_name: Evaluation run of Technoculture/MT7Bi-alpha-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/MT7Bi-alpha-dpo](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T21:20:32.408861](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo/blob/main/results_2024-02-02T21-20-32.408861.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5253447014480443,\n\
\ \"acc_stderr\": 0.034195124118131595,\n \"acc_norm\": 0.530565322003796,\n\
\ \"acc_norm_stderr\": 0.034921847920628496,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.43810210168491254,\n\
\ \"mc2_stderr\": 0.01497369498317419\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5085324232081911,\n \"acc_stderr\": 0.014609263165632186,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.570902210714997,\n\
\ \"acc_stderr\": 0.00493935814556132,\n \"acc_norm\": 0.7545309699263095,\n\
\ \"acc_norm_stderr\": 0.004294853999177863\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n\
\ \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n\
\ \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.033711241426263014,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.033711241426263014\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933886,\n\
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933886\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7229357798165138,\n\
\ \"acc_stderr\": 0.01918848259016953,\n \"acc_norm\": 0.7229357798165138,\n\
\ \"acc_norm_stderr\": 0.01918848259016953\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n\
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.0432076780753667,\n \"acc_norm\"\
: 0.6611570247933884,\n \"acc_norm_stderr\": 0.0432076780753667\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.027601921381417618,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.027601921381417618\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.722860791826309,\n\
\ \"acc_stderr\": 0.01600563629412242,\n \"acc_norm\": 0.722860791826309,\n\
\ \"acc_norm_stderr\": 0.01600563629412242\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.026424816594009845,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.026424816594009845\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098447,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063144,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063144\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.02821768355665231,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.02821768355665231\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347237,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347237\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3813559322033898,\n\
\ \"acc_stderr\": 0.012405509401888124,\n \"acc_norm\": 0.3813559322033898,\n\
\ \"acc_norm_stderr\": 0.012405509401888124\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159703,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5261437908496732,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.43810210168491254,\n\
\ \"mc2_stderr\": 0.01497369498317419\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638256\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2577710386656558,\n \
\ \"acc_stderr\": 0.012048370213576593\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/MT7Bi-alpha-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-20-32.408861.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- '**/details_harness|winogrande|5_2024-02-02T21-20-32.408861.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T21-20-32.408861.parquet'
- config_name: results
data_files:
- split: 2024_02_02T21_20_32.408861
path:
- results_2024-02-02T21-20-32.408861.parquet
- split: latest
path:
- results_2024-02-02T21-20-32.408861.parquet
---
# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/MT7Bi-alpha-dpo](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo",
"harness_winogrande_5",
split="train")
```
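Each run's split name encodes the run timestamp (e.g. `2024_02_02T21_20_32.408861`). As a sketch (this helper is not part of the evaluation tooling), the timestamp can be recovered as a `datetime` like so:

```python
from datetime import datetime

def parse_split_timestamp(split_name):
    """Convert a run split name such as '2024_02_02T21_20_32.408861'
    back into a datetime object."""
    date_part, time_part = split_name.split("T", 1)
    return datetime.strptime(
        date_part.replace("_", "-") + "T" + time_part.replace("_", ":"),
        "%Y-%m-%dT%H:%M:%S.%f",
    )

print(parse_split_timestamp("2024_02_02T21_20_32.408861"))
```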
## Latest results
These are the [latest results from run 2024-02-02T21:20:32.408861](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo/blob/main/results_2024-02-02T21-20-32.408861.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5253447014480443,
"acc_stderr": 0.034195124118131595,
"acc_norm": 0.530565322003796,
"acc_norm_stderr": 0.034921847920628496,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.43810210168491254,
"mc2_stderr": 0.01497369498317419
},
"harness|arc:challenge|25": {
"acc": 0.5085324232081911,
"acc_stderr": 0.014609263165632186,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.570902210714997,
"acc_stderr": 0.00493935814556132,
"acc_norm": 0.7545309699263095,
"acc_norm_stderr": 0.004294853999177863
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7229357798165138,
"acc_stderr": 0.01918848259016953,
"acc_norm": 0.7229357798165138,
"acc_norm_stderr": 0.01918848259016953
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417618,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417618
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.722860791826309,
"acc_stderr": 0.01600563629412242,
"acc_norm": 0.722860791826309,
"acc_norm_stderr": 0.01600563629412242
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.026424816594009845,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.026424816594009845
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098447,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063144,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063144
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665231,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665231
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347237,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347237
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3813559322033898,
"acc_stderr": 0.012405509401888124,
"acc_norm": 0.3813559322033898,
"acc_norm_stderr": 0.012405509401888124
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.43810210168491254,
"mc2_stderr": 0.01497369498317419
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638256
},
"harness|gsm8k|5": {
"acc": 0.2577710386656558,
"acc_stderr": 0.012048370213576593
}
}
```
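The per-task dictionary above can be aggregated in a few lines. A minimal sketch (only two sample entries are reproduced here; the real dict has one entry per MMLU subject):

```python
# Sketch: aggregate the per-task "acc" metric from a results dict shaped
# like the one above. Only two sample entries are included here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5111111111111111},
}

# Collect the "acc" metric of every hendrycksTest (MMLU) task.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```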
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FSDL-Fashion/dummy_swin_pipe | ---
dataset_info:
features:
- name: path
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 416000000
num_examples: 100000
download_size: 420001566
dataset_size: 416000000
---
# Dataset Card for "dummy_swin_pipe"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ProjectVesuvius/test2 | ---
dataset_info:
features:
- name: messages
dtype: string
- name: images
dtype: image
splits:
- name: train
num_bytes: 471599.0
num_examples: 2
download_size: 214546
dataset_size: 471599.0
---
# Dataset Card for "test2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bartoszmaj/stance_process_pandas_df | ---
dataset_info:
features:
- name: features
dtype: string
splits:
- name: train
num_bytes: 2093931118
num_examples: 4600698
download_size: 1205094691
dataset_size: 2093931118
---
# Dataset Card for "stance_process_pandas_df"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-world_religions | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 3518
num_examples: 5
- name: test
num_bytes: 349312
num_examples: 171
download_size: 63402
dataset_size: 352830
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-world_religions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Onno/hotel-images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Negative
'1': Positive
splits:
- name: train
num_bytes: 80097087.0
num_examples: 297
download_size: 80097226
dataset_size: 80097087.0
---
# Dataset Card for "hotel-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/roon_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of roon/ローン(μ兵装)/罗恩(μ兵装) (Azur Lane)
This is the dataset of roon/ローン(μ兵装)/罗恩(μ兵装) (Azur Lane), containing 500 images and their tags.
The core tags of this character are `multicolored_hair, blonde_hair, streaked_hair, red_hair, breasts, large_breasts, short_hair, hair_between_eyes, brown_eyes, bangs, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 779.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/roon_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 390.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/roon_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1263 | 891.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/roon_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 662.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/roon_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1263 | 1.32 GiB | [Download](https://huggingface.co/datasets/CyberHarem/roon_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/roon_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
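The IMG+TXT packages unpack into a flat directory where every image has a same-named `.txt` file holding its comma-separated tags. A minimal sketch for iterating such pairs without waifuc (the helper name and the extension set are illustrative assumptions):

```python
import os

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_list) for each image/.txt pair in dataset_dir.

    Images without a matching .txt tag file are skipped.
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```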
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, black_dress, cleavage, looking_at_viewer, solo, black_gloves, smile, elbow_gloves, blush, pantyhose, bare_shoulders, hair_ornament, yellow_eyes, choker |
| 1 | 12 |  |  |  |  |  | 1girl, black_dress, black_gloves, looking_at_viewer, solo, cleavage, pantyhose, clothing_cutout, elbow_gloves, microphone_stand, smile, black_corset, choker, yellow_eyes, holding, bare_shoulders, detached_sleeves, standing, two-tone_hair |
| 2 | 23 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, official_alternate_costume, black_dress, hair_ornament, looking_at_viewer, solo, smile, feather_boa, bracelet, halter_dress, black_thighhighs, sitting, tight_dress, medium_hair, ring, collarbone, two-tone_hair, parted_lips, wine_glass, grand_piano, holding_cup |
| 3 | 16 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, earrings, hair_ornament, looking_at_viewer, solo, cleavage, smile, thigh_strap, iron_cross, collarbone, armlet, sitting, weapon, yellow_eyes, blush, closed_mouth, nail_polish |
| 4 | 9 |  |  |  |  |  | 1girl, iron_cross, looking_at_viewer, smile, solo, simple_background, upper_body, white_background, black_jacket, collared_shirt, long_sleeves, grey_shirt, mechanical_hands |
| 5 | 32 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, iron_cross, long_sleeves, black_jacket, black_skirt, shirt, pleated_skirt, buttons, mechanical_hands |
| 6 | 6 |  |  |  |  |  | 1girl, :d, black_gloves, black_jacket, black_skirt, collared_shirt, long_sleeves, open_jacket, open_mouth, white_background, iron_cross, nose_blush, pleated_skirt, solo, chibi, grey_shirt, outstretched_arms, shadow, black_footwear, full_body, heart-shaped_pupils, looking_at_viewer, simple_background, standing |
| 7 | 8 |  |  |  |  |  | 1girl, black_shirt, red_necktie, short_sleeves, solo, simple_background, white_background, black_pantyhose, looking_at_viewer, pleated_skirt, smile, black_skirt, iron_cross, black_gloves, buttons, official_alternate_costume, blush, brown_belt, feet_out_of_frame, fingerless_gloves, two-tone_hair, yellow_eyes |
| 8 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, navel, smile, two-tone_hair, cleavage, yellow_eyes, black_bra, black_panties, blush, simple_background, underwear_only, white_background |
| 9 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_bikini, cleavage, outdoors, smile, navel, blue_sky, collarbone, day, ocean, wet, bare_shoulders, blush, cloud, water, beach, frilled_bikini, see-through, closed_mouth, hair_flower, iron_cross, open_mouth, standing |
| 10 | 15 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, nipples, penis, sex, vaginal, blush, open_mouth, girl_on_top, cowgirl_position, navel, looking_at_viewer, nude, sweat, cum_in_pussy, pov, censored, collarbone, smile |
| 11 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, wide_sleeves, holding, obi, yellow_eyes, black_kimono, hair_flower, open_mouth, simple_background, standing, two-tone_hair, white_background, alternate_costume, fur-trimmed_kimono, jewelry, long_sleeves, omikuji |
| 12 | 5 |  |  |  |  |  | 1girl, black_leotard, cleavage, fake_animal_ears, playboy_bunny, rabbit_ears, strapless_leotard, wrist_cuffs, detached_collar, looking_at_viewer, smile, solo, bare_shoulders, blush, yellow_eyes, alternate_costume, black_pantyhose, bowtie, light_brown_hair, medium_hair, nail_polish, rabbit_tail, simple_background, two-tone_hair |
| 13 | 6 |  |  |  |  |  | black_shirt, bottomless, 1girl, blush, camera, holding, smile, t-shirt, ass, clothes_writing, open_mouth, short_sleeves, solo_focus, 2girls |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | cleavage | looking_at_viewer | solo | black_gloves | smile | elbow_gloves | blush | pantyhose | bare_shoulders | hair_ornament | yellow_eyes | choker | clothing_cutout | microphone_stand | black_corset | holding | detached_sleeves | standing | two-tone_hair | official_alternate_costume | feather_boa | bracelet | halter_dress | black_thighhighs | sitting | tight_dress | medium_hair | ring | collarbone | parted_lips | wine_glass | grand_piano | holding_cup | earrings | thigh_strap | iron_cross | armlet | weapon | closed_mouth | nail_polish | simple_background | upper_body | white_background | black_jacket | collared_shirt | long_sleeves | grey_shirt | mechanical_hands | black_skirt | shirt | pleated_skirt | buttons | :d | open_jacket | open_mouth | nose_blush | chibi | outstretched_arms | shadow | black_footwear | full_body | heart-shaped_pupils | black_shirt | red_necktie | short_sleeves | black_pantyhose | brown_belt | feet_out_of_frame | fingerless_gloves | navel | black_bra | black_panties | underwear_only | black_bikini | outdoors | blue_sky | day | ocean | wet | cloud | water | beach | frilled_bikini | see-through | hair_flower | 1boy | hetero | solo_focus | nipples | penis | sex | vaginal | girl_on_top | cowgirl_position | nude | sweat | cum_in_pussy | pov | censored | wide_sleeves | obi | black_kimono | alternate_costume | fur-trimmed_kimono | jewelry | omikuji | black_leotard | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | detached_collar | bowtie | light_brown_hair | rabbit_tail | bottomless | camera | t-shirt | ass | clothes_writing | 2girls |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:-----------|:--------------------|:-------|:---------------|:--------|:---------------|:--------|:------------|:-----------------|:----------------|:--------------|:---------|:------------------|:-------------------|:---------------|:----------|:-------------------|:-----------|:----------------|:-----------------------------|:--------------|:-----------|:---------------|:-------------------|:----------|:--------------|:--------------|:-------|:-------------|:--------------|:-------------|:--------------|:--------------|:-----------|:--------------|:-------------|:---------|:---------|:---------------|:--------------|:--------------------|:-------------|:-------------------|:---------------|:-----------------|:---------------|:-------------|:-------------------|:--------------|:--------|:----------------|:----------|:-----|:--------------|:-------------|:-------------|:--------|:--------------------|:---------|:-----------------|:------------|:----------------------|:--------------|:--------------|:----------------|:------------------|:-------------|:--------------------|:--------------------|:--------|:------------|:----------------|:-----------------|:---------------|:-----------|:-----------|:------|:--------|:------|:--------|:--------|:--------|:-----------------|:--------------|:--------------|:-------|:---------|:-------------|:----------|:--------|:------|:----------|:--------------|:-------------------|:-------|:--------|:---------------|:------|:-----------|:---------------|:------|:---------------|:--------------------|:---------------------|:----------|:----------|:----------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:------------------|:---------|:-------------------|:--------------|:-------------|:---------|:----------|:------|:------------------|:---------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 23 |  |  |  |  |  | X | X | X | X | X | | X | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | X | X | | | | | | | | | | | | | | X | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 32 |  |  |  |  |  | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | X | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | X | X | X | X | | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | X | | | | | X | | X | | | | | | X | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | X | X | X | | X | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | X | | X | X | X | | X | | X | | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 15 |  |  |  |  |  | X | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | | X | X | | X | | | | | | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | X | X | X | | X | | X | | X | | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 13 | 6 |  |  |  |  |  | X | | | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
tomekkorbak/hh-rlhf_with_features | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
splits:
- name: train
num_bytes: 17131073
num_examples: 19148
download_size: 8313539
dataset_size: 17131073
---
# Dataset Card for "hh-rlhf_with_features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Curvature/testdata | ---
license: other
---
|
morj/renaissance_portraits | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-to-image
language:
- en
pretty_name: renaissance_portraits
size_categories:
- n<1K
tags:
- art
- renaissance
- finetune
---
## Data
This dataset is uploaded as a .tar.gz file that was originally used to fine-tune a Stable Diffusion model.
It consists of 11 Renaissance-era portraits of human figures who are often rendered in dynamic poses, showing expression and possibly using gesture.
Renaissance portraits are characterized by realism, with the subject being the focus of the work and the background being plain.
Additionally, the archive includes a .csv file with two columns: one that serves as a placeholder for an image path and the other for the textual description used in training the model.
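The layout described above — a .tar.gz of images paired with a two-column CSV of (image path, description) — can be sketched in-memory as follows. All file and column names here are hypothetical, since the card does not state them; this only illustrates the archive/CSV structure.

```python
import csv
import io
import tarfile

# Hypothetical rows: (image path placeholder, textual description).
rows = [("images/portrait_01.jpg", "renaissance portrait of a nobleman")]

# Serialize the two-column CSV.
csv_buf = io.StringIO()
csv.writer(csv_buf).writerows(rows)
data = csv_buf.getvalue().encode()

# Pack it into an in-memory tar.gz, mimicking the upload format.
archive = io.BytesIO()
with tarfile.open(fileobj=archive, mode="w:gz") as tar:
    info = tarfile.TarInfo("captions.csv")  # hypothetical member name
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

# Read it back the way a fine-tuning script might.
archive.seek(0)
with tarfile.open(fileobj=archive, mode="r:gz") as tar:
    member = tar.extractfile("captions.csv")
    parsed = list(csv.reader(io.TextIOWrapper(member)))

print(parsed)
```

A real script would extract the image files alongside the CSV and resolve each path column entry against the extracted directory.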
Image Format: .jpg <br>
Image Size: 256 x 256px |
Hack90/ncbi_genbank_part_6 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 20271283259
num_examples: 83750
download_size: 9364736791
dataset_size: 20271283259
---
# Dataset Card for "ncbi_genbank_part_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B | ---
pretty_name: Evaluation run of diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B](https://huggingface.co/diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T20:20:55.847857](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B/blob/main/results_2024-01-13T20-20-55.847857.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5664354333372791,\n\
\ \"acc_stderr\": 0.033523024960411534,\n \"acc_norm\": 0.5714502762424973,\n\
\ \"acc_norm_stderr\": 0.034221321166461816,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.016656997109125136,\n \"mc2\": 0.5143942772336377,\n\
\ \"mc2_stderr\": 0.015015865193028501\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221009,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192598\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.638020314678351,\n\
\ \"acc_stderr\": 0.004795908282584543,\n \"acc_norm\": 0.8386775542720574,\n\
\ \"acc_norm_stderr\": 0.003670763673792967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"\
acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534734,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n\
\ \"acc_stderr\": 0.015517322365529638,\n \"acc_norm\": 0.7484035759897829,\n\
\ \"acc_norm_stderr\": 0.015517322365529638\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46256983240223465,\n\
\ \"acc_stderr\": 0.016675578687308082,\n \"acc_norm\": 0.46256983240223465,\n\
\ \"acc_norm_stderr\": 0.016675578687308082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n\
\ \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n\
\ \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.016656997109125136,\n \"mc2\": 0.5143942772336377,\n\
\ \"mc2_stderr\": 0.015015865193028501\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30401819560272936,\n \
\ \"acc_stderr\": 0.012670420440198662\n }\n}\n```"
repo_url: https://huggingface.co/diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|arc:challenge|25_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|gsm8k|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hellaswag|10_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T20-20-55.847857.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- '**/details_harness|winogrande|5_2024-01-13T20-20-55.847857.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T20-20-55.847857.parquet'
- config_name: results
data_files:
- split: 2024_01_13T20_20_55.847857
path:
- results_2024-01-13T20-20-55.847857.parquet
- split: latest
path:
- results_2024-01-13T20-20-55.847857.parquet
---
# Dataset Card for Evaluation run of liminerity/ultra0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
"harness_winogrande_5",
split="train")
```
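The "latest" split of each configuration holds a nested results dictionary like the JSON shown below. As a minimal sketch of how to work with it (using illustrative numbers, not values from an actual run, and assuming a simple unweighted macro average over tasks, which is not necessarily how the leaderboard aggregates), you can pull the per-task scores out of such a dictionary:

```python
# Sketch: extract per-task accuracies from a results dictionary shaped like
# the JSON below and compute their unweighted macro average.
# The sample numbers here are illustrative, not taken from an actual run.
results = {
    "harness|arc:challenge|25": {"acc": 0.58, "acc_norm": 0.62},
    "harness|hellaswag|10": {"acc": 0.64, "acc_norm": 0.84},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36, "acc_norm": 0.36},
}

# Per-task normalized accuracy, keyed by the task name between the pipes.
acc_norm_by_task = {
    key.split("|")[1]: metrics["acc_norm"] for key, metrics in results.items()
}

# Unweighted macro average over tasks.
macro_acc_norm = sum(acc_norm_by_task.values()) / len(acc_norm_by_task)
print(f"{macro_acc_norm:.4f}")  # prints 0.6067
```

The same key-splitting pattern applies to the real dictionary below, where the per-task entries follow the `harness|<task>|<n_shots>` naming convention.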
## Latest results
These are the [latest results from run 2024-01-13T20:20:55.847857](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-01-13T20-20-55.847857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5664354333372791,
"acc_stderr": 0.033523024960411534,
"acc_norm": 0.5714502762424973,
"acc_norm_stderr": 0.034221321166461816,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125136,
"mc2": 0.5143942772336377,
"mc2_stderr": 0.015015865193028501
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221009,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192598
},
"harness|hellaswag|10": {
"acc": 0.638020314678351,
"acc_stderr": 0.004795908282584543,
"acc_norm": 0.8386775542720574,
"acc_norm_stderr": 0.003670763673792967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529638,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529638
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46256983240223465,
"acc_stderr": 0.016675578687308082,
"acc_norm": 0.46256983240223465,
"acc_norm_stderr": 0.016675578687308082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097074,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125136,
"mc2": 0.5143942772336377,
"mc2_stderr": 0.015015865193028501
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838234
},
"harness|gsm8k|5": {
"acc": 0.30401819560272936,
"acc_stderr": 0.012670420440198662
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chargoddard/QuALITY-instruct | ---
language: en
pretty_name: https://github.com/nyu-mll/quality
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 62125756
num_examples: 2523
- name: dev
num_bytes: 50877356
num_examples: 2086
download_size: 5451636
dataset_size: 113003112
---
# QuALITY: Question Answering with Long Input Texts, Yes!
This is the QuALITY v1.0.1 training set converted to instruction-style prompts. All credit to the original authors. See https://github.com/nyu-mll/quality for details. |
JUsnin1101/2023 | ---
license: openrail
---
|
Nandini82/sciq-rag-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3747547
num_examples: 6334
- name: validation
num_bytes: 302566
num_examples: 525
- name: test
num_bytes: 333672
num_examples: 551
download_size: 2328955
dataset_size: 4383785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
rcugarte/font_training_sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 14574295.720496895
num_examples: 289
download_size: 14545480
dataset_size: 14574295.720496895
---
# Dataset Card for "font_training_data_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tsushima_yoshiko_lovelivesunshine | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsushima_yoshiko/津島善子/츠시마요시코 (Love Live! Sunshine!!)
This is the dataset of tsushima_yoshiko/津島善子/츠시마요시코 (Love Live! Sunshine!!), containing 500 images and their tags.
The core tags of this character are `blue_hair, bangs, long_hair, hair_bun, purple_eyes, single_hair_bun, single_side_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 734.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsushima_yoshiko_lovelivesunshine/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 387.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsushima_yoshiko_lovelivesunshine/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1232 | 852.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsushima_yoshiko_lovelivesunshine/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 634.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsushima_yoshiko_lovelivesunshine/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1232 | 1.23 GiB | [Download](https://huggingface.co/datasets/CyberHarem/tsushima_yoshiko_lovelivesunshine/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tsushima_yoshiko_lovelivesunshine',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, skirt, solo, earrings, hair_ornament, midriff, navel, one_eye_closed, open_mouth, smile, white_gloves, blush, red_eyes, star_(symbol), v_over_eye, white_thighhighs, ;d, character_name, frills |
| 1 | 5 |  |  |  |  |  | 1girl, serafuku, solo, uranohoshi_school_uniform, looking_at_viewer, open_mouth, smile, blush, one_eye_closed, skirt, ;d, white_background |
| 2 | 9 |  |  |  |  |  | 1girl, black_thighhighs, long_sleeves, looking_at_viewer, pleated_skirt, serafuku, solo, uranohoshi_school_uniform, zettai_ryouiki, grey_skirt, simple_background, white_background, blush, miniskirt, smile, grey_sailor_collar, shirt, yellow_bowtie, collarbone, shiny_hair, sitting, standing |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, solo, uranohoshi_school_uniform, grey_skirt, pleated_skirt, short_sleeves, smile, white_shirt, blush, grey_sailor_collar, simple_background, black_thighhighs, closed_mouth, miniskirt, shiny_hair, white_background, yellow_neckerchief, zettai_ryouiki, breasts, red_neckerchief, sleeveless |
| 4 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, white_shirt, blush, feather_hair_ornament, short_sleeves, black_capelet, black_skirt, heart_earrings, holding_staff, dated, open_mouth |
| 5 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, black_wings, elbow_gloves, black_dress, black_gloves, black_thighhighs, blush, medium_breasts, one_eye_closed, feathers, halterneck, open_mouth |
| 6 | 9 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, hat, looking_at_viewer, short_sleeves, solo, black_headwear, feathered_wings, shiny_hair, smile, striped_thighhighs, epaulettes, nail_polish, black_wings, sitting, white_feathers, black_dress, black_nails, white_wings |
| 7 | 6 |  |  |  |  |  | 1girl, demon_horns, demon_tail, demon_wings, looking_at_viewer, solo, earrings, heart, hood_up, striped_thighhighs, detached_sleeves, garter_straps, nail_polish, skirt, black_footwear, blush, boots, breasts, demon_girl, lollipop, smile |
| 8 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, black_bikini, navel, medium_breasts, open_mouth, cleavage, collarbone, day, cloud, :d, outdoors, blue_sky, earrings |
| 9 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, animal_hood, chinese_clothes, clothing_cutout, thighhighs, blush, earrings, hair_ornament, sleeves_past_fingers, smile, boots, hood_up, stuffed_animal, black_footwear, blue_skirt, detached_sleeves, floral_print, long_sleeves, open_mouth, panda, striped, v-shaped_eyebrows |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | skirt | solo | earrings | hair_ornament | midriff | navel | one_eye_closed | open_mouth | smile | white_gloves | blush | red_eyes | star_(symbol) | v_over_eye | white_thighhighs | ;d | character_name | frills | serafuku | uranohoshi_school_uniform | white_background | black_thighhighs | long_sleeves | pleated_skirt | zettai_ryouiki | grey_skirt | simple_background | miniskirt | grey_sailor_collar | shirt | yellow_bowtie | collarbone | shiny_hair | sitting | standing | short_sleeves | white_shirt | closed_mouth | yellow_neckerchief | breasts | red_neckerchief | sleeveless | feather_hair_ornament | black_capelet | black_skirt | heart_earrings | holding_staff | dated | black_wings | elbow_gloves | black_dress | black_gloves | medium_breasts | feathers | halterneck | fingerless_gloves | hat | black_headwear | feathered_wings | striped_thighhighs | epaulettes | nail_polish | white_feathers | black_nails | white_wings | demon_horns | demon_tail | demon_wings | heart | hood_up | detached_sleeves | garter_straps | black_footwear | boots | demon_girl | lollipop | black_bikini | cleavage | day | cloud | :d | outdoors | blue_sky | animal_hood | chinese_clothes | clothing_cutout | thighhighs | sleeves_past_fingers | stuffed_animal | blue_skirt | floral_print | panda | striped | v-shaped_eyebrows |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-----------|:----------------|:----------|:--------|:-----------------|:-------------|:--------|:---------------|:--------|:-----------|:----------------|:-------------|:-------------------|:-----|:-----------------|:---------|:-----------|:----------------------------|:-------------------|:-------------------|:---------------|:----------------|:-----------------|:-------------|:--------------------|:------------|:---------------------|:--------|:----------------|:-------------|:-------------|:----------|:-----------|:----------------|:--------------|:---------------|:---------------------|:----------|:------------------|:-------------|:------------------------|:----------------|:--------------|:-----------------|:----------------|:--------|:--------------|:---------------|:--------------|:---------------|:-----------------|:-----------|:-------------|:--------------------|:------|:-----------------|:------------------|:---------------------|:-------------|:--------------|:-----------------|:--------------|:--------------|:--------------|:-------------|:--------------|:--------|:----------|:-------------------|:----------------|:-----------------|:--------|:-------------|:-----------|:---------------|:-----------|:------|:--------|:-----|:-----------|:-----------|:--------------|:------------------|:------------------|:-------------|:-----------------------|:-----------------|:-------------|:---------------|:--------|:----------|:--------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | | | X | X | X | | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | | | | | | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | | | | | | | X | | X | | | | | | | | X | X | X | X | | X | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | | X | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | X | | X | | | | | X | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 8 | 21 |  |  |  |  |  | X | X | | X | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | | X | X | X | | | | X | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
NickyNicky/aya_dataset_multilingual_inputs_targets_ext4 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: targets_es
dtype: string
- name: targets_en
dtype: string
- name: targets_fr
dtype: string
- name: targets_de
dtype: string
- name: inputs_es
dtype: string
- name: inputs_en
dtype: string
- name: inputs_fr
dtype: string
- name: inputs_de
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3018919
num_examples: 1000
download_size: 1940201
dataset_size: 3018919
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skytnt/anime-segmentation | ---
annotations_creators: []
language: []
language_creators: []
license:
- cc0-1.0
multilinguality: []
pretty_name: Anime Segmentation
size_categories:
- 10K<n<100K
source_datasets:
- original
tags: []
task_categories:
- image-segmentation
task_ids:
- semantic-segmentation
---
## Dataset Description
A segmentation dataset for anime characters
My project: [anime-segmentation](https://github.com/SkyTNT/anime-segmentation)
### Dataset Summary
| Dir | Description | Format | Images |
| ---- | ---- | ---- | ---- |
| bg | background images | jpg | 8057 |
| fg | foreground images, transparent background | png | 11802 |
| imgs | real images with background and foreground| jpg | 1111 |
| masks| labels for imgs | jpg | 1111 |
Total size: 18GB
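Since the `imgs` and `masks` directories are aligned by filename, supervised (image, mask) pairs can be assembled with a few lines of standard-library Python. This is a minimal sketch; the function name, and the assumption that every image has a same-named mask, are illustrative rather than part of the dataset itself:

```python
from pathlib import Path


def pair_images_with_masks(root):
    """Pair each composited image with its label mask by shared filename stem.

    Assumes the layout described above: <root>/imgs/*.jpg holds the real
    images and <root>/masks/*.jpg the matching labels.
    """
    imgs = {p.stem: p for p in (Path(root) / "imgs").glob("*.jpg")}
    masks = {p.stem: p for p in (Path(root) / "masks").glob("*.jpg")}
    # Only keep stems present in both directories, in a stable order.
    shared = sorted(imgs.keys() & masks.keys())
    return [(imgs[s], masks[s]) for s in shared]
```

Images without a matching mask (e.g. the standalone `bg`/`fg` pools) are simply skipped by the intersection.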
### Collection Method
I collected the backgrounds from [character_bg_seg_data](https://github.com/ShuhongChen/bizarre-pose-estimator#download).
I collected the foregrounds from the danbooru website.
I collected the imgs and masks from [AniSeg](https://github.com/jerryli27/AniSeg#about-the-models) and the danbooru website.
I used [Real-ESRGAN](https://github.com/xinntao/Real-ESRGAN) to restore the background images.
I cleaned the dataset using [DeepDanbooru](https://github.com/KichangKim/DeepDanbooru) first, then manually, to make sure every foreground is an anime character.
### Contributions
Thanks to [@SkyTNT](https://github.com/SkyTNT) for adding this dataset.
Thanks to [@ShuhongChen](https://github.com/ShuhongChen) for [character_bg_seg_data](https://github.com/ShuhongChen/bizarre-pose-estimator#download)
Thanks to [@jerryli27](https://github.com/jerryli27) for [AniSeg](https://github.com/jerryli27/AniSeg#about-the-models)
|
n3rd0/DreamBook_Guanaco_Format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2514729
num_examples: 8548
- name: test
num_bytes: 301023
num_examples: 949
download_size: 1066863
dataset_size: 2815752
---
# Dataset Card for "DreamBook_Guanaco_Format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713217742 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 34185
num_examples: 96
download_size: 26375
dataset_size: 34185
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713217742"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cartinoe5930/few-shot-qwen-1.8b | ---
dataset_info:
features:
- name: response
dtype: string
- name: predictied_answer
dtype: int64
- name: actual_answer
dtype: int64
splits:
- name: train
num_bytes: 703249
num_examples: 1319
download_size: 272558
dataset_size: 703249
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/unnatural-instructions_standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 97758871
num_examples: 722010
download_size: 23018949
dataset_size: 97758871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "unnatural-instructions_standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HAERAE-HUB/HAE_RAE_BENCH_1.1 | ---
license: cc-by-nc-nd-4.0
extra_gated_prompt: To request access to the dataset, please fill out this form, and
we'll review and let you know if your use case is approved.
extra_gated_fields:
First Name: text
Last Name: text
Institution: text
Intended Use: text
I agree to use this dataset for non-commercial research ONLY: checkbox
task_categories:
- multiple-choice
language:
- ko
configs:
- config_name: correct_definition_matching
data_files:
- split: test
path: "data/correct_definition_matching-00000-of-00001-65122a307e90d70a.parquet"
- config_name: csat_geo
data_files:
- split: test
path: "data/csat_geo-00000-of-00001-68ce37d4952173a9.parquet"
- config_name: csat_law
data_files:
- split: test
path: "data/csat_law-00000-of-00001-ef0ec0fac2f2238f.parquet"
- config_name: csat_socio
data_files:
- split: test
path: "data/csat_socio-00000-of-00001-0cfadd93827d0d48.parquet"
- config_name: date_understanding
data_files:
- split: test
path: "data/date_understanding-00000-of-00001-36887d05afd4c57f.parquet"
- config_name: general_knowledge
data_files:
- split: test
path: "data/general_knowledge-00000-of-00001-7418d1a0b8c4a615.parquet"
- config_name: history
data_files:
- split: test
path: "data/history-00000-of-00001-9ad551d475d58413.parquet"
- config_name: loan_words
data_files:
- split: test
path: "data/loan_words-00000-of-00001-96ed08632b9a2ae8.parquet"
- config_name: lyrics_denoising
data_files:
- split: test
path: "data/lyrics_denoising-00000-of-00001-ede1ce975e9bad54.parquet"
- config_name: proverbs_denoising
data_files:
- split: test
path: "data/proverbs_denoising-00000-of-00001-f274ad136676b2e3.parquet"
- config_name: rare_words
data_files:
- split: test
path: "data/rare_words-00000-of-00001-4f34314c09209524.parquet"
- config_name: standard_nomenclature
data_files:
- split: test
path: "data/standard_nomenclature-00000-of-00001-52814989a6843328.parquet"
- config_name: reading_comprehension
data_files:
- split: test
path: "data/reading_comprehension-00000-of-00001-f9c8df20c22e46d0.parquet"
---
HAE_RAE_BENCH 1.1 is an ongoing project to develop a suite of evaluation tasks designed to test
models' understanding of Korean cultural and contextual nuances.
Currently, it comprises 13 distinct tasks, with a total of 4900 instances.
Please note that although this repository contains datasets from the original [HAE-RAE BENCH paper](https://arxiv.org/abs/2309.02706),
the contents are not completely identical. Specifically, the reading comprehension subset from the original version has been removed due to copyright constraints.
In its place, an updated reading comprehension subset has been introduced, sourced from the CSAT, the Korean university entrance examination.
To replicate the studies from the paper, please see [code](https://huggingface.co/datasets/HAERAE-HUB/HAE_RAE_BENCH_1.0).
### Dataset Overview
| Task | Instances | Version | Explanation |
|-----------------------------|-----------|---------|---------------------------------------------------------------------|
| standard_nomenclature | 153 | v1.0 | Multiple-choice questions about Korean standard nomenclatures from NIKL. |
| loan_word | 169 | v1.0 | Multiple-choice questions about Korean loan words from NIKL. |
| rare_word | 405 | v1.0 | Multiple-choice questions about rare Korean words from NIKL. |
| general_knowledge | 176 | v1.0 | Multiple-choice questions on Korean cultural knowledge. |
| history | 188 | v1.0 | Multiple-choice questions on Korean history. |
| date_understanding | 475 | v1.1 | Multiple-choice questions about understanding Korean date expressions. |
| proverbs_denoising | 672 | v1.1 | Questions for generating Korean proverbs. |
| correct_definition_matching | 439 | v1.1 | Multiple-choice questions on matching the correct definition to words in context. |
| lyrics_denoising | 622 | v1.1 | Questions for reconstructing Korean pop song lyrics. |
| reading_comprehension | 936 | v1.1 | Multiple-choice questions on Korean reading comprehension from the CSAT. |
| csat_law | 217 | v1.1 | Multiple-choice questions on Korean law from the CSAT. |
| csat_geo | 150 | v1.1 | Multiple-choice questions on Korean geography from the CSAT. |
| csat_socio | 298 | v1.1 | Multiple-choice questions on Korean social studies from the CSAT. |
| **Total** | **4900** | | |
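As a quick sanity check, the per-task counts in the table above sum to the stated total. The snippet below is a small stand-alone illustration of the benchmark's composition, not part of the benchmark code:

```python
# Per-task instance counts, copied from the dataset overview table.
task_counts = {
    "standard_nomenclature": 153,
    "loan_word": 169,
    "rare_word": 405,
    "general_knowledge": 176,
    "history": 188,
    "date_understanding": 475,
    "proverbs_denoising": 672,
    "correct_definition_matching": 439,
    "lyrics_denoising": 622,
    "reading_comprehension": 936,
    "csat_law": 217,
    "csat_geo": 150,
    "csat_socio": 298,
}

total = sum(task_counts.values())
print(f"{len(task_counts)} tasks, {total} instances")  # 13 tasks, 4900 instances
```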
### Point of Contact
For any questions, contact us via the following email :)
```
spthsrbwls123@yonsei.ac.kr
``` |
abcefgdfdsf/stablediff | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_DreadPoor__complect-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/complect-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/complect-7B-slerp](https://huggingface.co/DreadPoor/complect-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__complect-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T15:30:34.734820](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__complect-7B-slerp/blob/main/results_2024-03-07T15-30-34.734820.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654501445035387,\n\
\ \"acc_stderr\": 0.032055837406835125,\n \"acc_norm\": 0.654143289555927,\n\
\ \"acc_norm_stderr\": 0.032724844511645854,\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.711356205167106,\n\
\ \"mc2_stderr\": 0.01471283539228549\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932487,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7094204341764588,\n\
\ \"acc_stderr\": 0.004531019159414106,\n \"acc_norm\": 0.8818960366460864,\n\
\ \"acc_norm_stderr\": 0.003220716126685026\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886783,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886783\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662264,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662264\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.01644283065471554,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.01644283065471554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.711356205167106,\n\
\ \"mc2_stderr\": 0.01471283539228549\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.012799353675801834\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/complect-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|arc:challenge|25_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|gsm8k|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hellaswag|10_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-30-34.734820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T15-30-34.734820.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- '**/details_harness|winogrande|5_2024-03-07T15-30-34.734820.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T15-30-34.734820.parquet'
- config_name: results
data_files:
- split: 2024_03_07T15_30_34.734820
path:
- results_2024-03-07T15-30-34.734820.parquet
- split: latest
path:
- results_2024-03-07T15-30-34.734820.parquet
---
# Dataset Card for Evaluation run of DreadPoor/complect-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/complect-7B-slerp](https://huggingface.co/DreadPoor/complect-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__complect-7B-slerp",
"harness_winogrande_5",
split="train")
```
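Each harness config above keeps every timestamped split alongside the `latest` alias. As a minimal sketch (assuming only the split-name format shown in the YAML, e.g. `2024_03_07T15_30_34.734820`), the newest timestamped split can be picked by hand:

```python
from datetime import datetime

def newest_timestamped_split(split_names):
    """Return the most recent timestamped split, ignoring the 'latest' alias."""
    stamped = [s for s in split_names if s != "latest"]
    # Split names use '_' where ISO timestamps use '-' and ':' (see the YAML above).
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["latest", "2024_03_07T15_30_34.734820"]
print(newest_timestamped_split(splits))  # → 2024_03_07T15_30_34.734820
```

`latest` already points at the same files, so this only becomes useful once a repo accumulates several runs.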
## Latest results
These are the [latest results from run 2024-03-07T15:30:34.734820](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__complect-7B-slerp/blob/main/results_2024-03-07T15-30-34.734820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.654501445035387,
"acc_stderr": 0.032055837406835125,
"acc_norm": 0.654143289555927,
"acc_norm_stderr": 0.032724844511645854,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.711356205167106,
"mc2_stderr": 0.01471283539228549
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.013449522109932487,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.7094204341764588,
"acc_stderr": 0.004531019159414106,
"acc_norm": 0.8818960366460864,
"acc_norm_stderr": 0.003220716126685026
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886783,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886783
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662264,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662264
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.01644283065471554,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.01644283065471554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.711356205167106,
"mc2_stderr": 0.01471283539228549
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433535
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801834
}
}
```
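The JSON above is a flat dict keyed as `harness|<task>|<n_shots>`, so aggregates can be recomputed by key prefix. A minimal sketch, using a few `acc` values copied from the results shown (not the full set):

```python
# Per-task scores copied from the results JSON above; the full dict uses
# the same "harness|<task>|<n_shots>" key format.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks present.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # → 0.5559
```

(The `all` block at the top of the real file aggregates across every evaluated task in the same spirit.)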
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
silvershades483/Hindi-ASR | ---
pretty_name: Hindi-ASR
---
This is a dataset for fine-tuning Whisper on Hindi audio. |
kejian/codeparrot-train-more-filter-3.3b-cleaned | ---
annotations_creators:
- machine-generated
source_datasets:
- codeparrot
task_categories:
- text-classification
tags:
- pretraining-with-human-feedback
- pep8
- python
- codeparrot
size_categories:
- 100K<n<1M
license: mit
--- |
beskrovnykh/daniel-dataset-part-2 | ---
dataset_info:
features:
- name: title
dtype: string
- name: published
dtype: string
- name: url
dtype: string
- name: id
dtype: string
- name: text
dtype: string
- name: start
dtype: float64
- name: end
dtype: float64
splits:
- name: train
num_bytes: 25686985
num_examples: 98360
download_size: 4557381
dataset_size: 25686985
---
# Dataset Card for "daniel-dataset-part-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
basilis/wvDatasetFinal | ---
dataset_info:
features:
- name: final_text
sequence: string
splits:
- name: train
num_bytes: 6722209183
num_examples: 97928
download_size: 1660170870
dataset_size: 6722209183
---
# Dataset Card for "wvDatasetFinal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_189 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24485516640.75
num_examples: 254930
download_size: 22019228379
dataset_size: 24485516640.75
---
# Dataset Card for "chunk_189"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/chemistry_dataset_standardized_cluster_3_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 16204599
num_examples: 10664
download_size: 7455328
dataset_size: 16204599
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_cluster_3_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jambroz__sixtyoneeighty-FNCARL-7B-slerp | ---
pretty_name: Evaluation run of jambroz/sixtyoneeighty-FNCARL-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jambroz/sixtyoneeighty-FNCARL-7B-slerp](https://huggingface.co/jambroz/sixtyoneeighty-FNCARL-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jambroz__sixtyoneeighty-FNCARL-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T14:48:15.370994](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-FNCARL-7B-slerp/blob/main/results_2024-03-07T14-48-15.370994.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6579575079947922,\n\
\ \"acc_stderr\": 0.03184599599003659,\n \"acc_norm\": 0.6576222975014011,\n\
\ \"acc_norm_stderr\": 0.03250733079748358,\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140347,\n \"mc2\": 0.7151537867325115,\n\
\ \"mc2_stderr\": 0.014552350700213698\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n\
\ \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6999601672973511,\n\
\ \"acc_stderr\": 0.004573383672159082,\n \"acc_norm\": 0.8778131846245768,\n\
\ \"acc_norm_stderr\": 0.003268321260913629\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253262,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \"\
acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066297,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.01661139368726858,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.01661139368726858\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949829,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949829\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140347,\n \"mc2\": 0.7151537867325115,\n\
\ \"mc2_stderr\": 0.014552350700213698\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.01051033695416674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \
\ \"acc_stderr\": 0.012435042334904013\n }\n}\n```"
repo_url: https://huggingface.co/jambroz/sixtyoneeighty-FNCARL-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-48-15.370994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-48-15.370994.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- '**/details_harness|winogrande|5_2024-03-07T14-48-15.370994.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T14-48-15.370994.parquet'
- config_name: results
data_files:
- split: 2024_03_07T14_48_15.370994
path:
- results_2024-03-07T14-48-15.370994.parquet
- split: latest
path:
- results_2024-03-07T14-48-15.370994.parquet
---
# Dataset Card for Evaluation run of jambroz/sixtyoneeighty-FNCARL-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jambroz/sixtyoneeighty-FNCARL-7B-slerp](https://huggingface.co/jambroz/sixtyoneeighty-FNCARL-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jambroz__sixtyoneeighty-FNCARL-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-07T14:48:15.370994](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-FNCARL-7B-slerp/blob/main/results_2024-03-07T14-48-15.370994.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6579575079947922,
"acc_stderr": 0.03184599599003659,
"acc_norm": 0.6576222975014011,
"acc_norm_stderr": 0.03250733079748358,
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140347,
"mc2": 0.7151537867325115,
"mc2_stderr": 0.014552350700213698
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.6999601672973511,
"acc_stderr": 0.004573383672159082,
"acc_norm": 0.8778131846245768,
"acc_norm_stderr": 0.003268321260913629
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253262,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066297,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.01661139368726858,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.01661139368726858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949829,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949829
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140347,
"mc2": 0.7151537867325115,
"mc2_stderr": 0.014552350700213698
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.01051033695416674
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.012435042334904013
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Wanfq__FuseLLM-7B | ---
pretty_name: Evaluation run of Wanfq/FuseLLM-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Wanfq/FuseLLM-7B](https://huggingface.co/Wanfq/FuseLLM-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Wanfq__FuseLLM-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T16:48:44.963342](https://huggingface.co/datasets/open-llm-leaderboard/details_Wanfq__FuseLLM-7B/blob/main/results_2024-02-01T16-48-44.963342.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4801705727969372,\n\
\ \"acc_stderr\": 0.034512318476802376,\n \"acc_norm\": 0.48541393567248253,\n\
\ \"acc_norm_stderr\": 0.035295245059963676,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.38170797049642685,\n\
\ \"mc2_stderr\": 0.013464482874681617\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4991467576791809,\n \"acc_stderr\": 0.014611369529813272,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.01458063756999542\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5878311093407688,\n\
\ \"acc_stderr\": 0.004912192800263312,\n \"acc_norm\": 0.7871937860983867,\n\
\ \"acc_norm_stderr\": 0.00408455264190366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.653211009174312,\n\
\ \"acc_stderr\": 0.020406097104093027,\n \"acc_norm\": 0.653211009174312,\n\
\ \"acc_norm_stderr\": 0.020406097104093027\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202622,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202622\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.043482080516448585,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.043482080516448585\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.03035152732334493,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.03035152732334493\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.01684117465529572,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.01684117465529572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369794,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369794\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325956,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34159061277705344,\n\
\ \"acc_stderr\": 0.012112391320842849,\n \"acc_norm\": 0.34159061277705344,\n\
\ \"acc_norm_stderr\": 0.012112391320842849\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4526143790849673,\n \"acc_stderr\": 0.020136790918492537,\n \
\ \"acc_norm\": 0.4526143790849673,\n \"acc_norm_stderr\": 0.020136790918492537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748017,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748017\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.38170797049642685,\n\
\ \"mc2_stderr\": 0.013464482874681617\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552666\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14329037149355572,\n \
\ \"acc_stderr\": 0.00965089572335757\n }\n}\n```"
repo_url: https://huggingface.co/Wanfq/FuseLLM-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|arc:challenge|25_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|gsm8k|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hellaswag|10_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-48-44.963342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T16-48-44.963342.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- '**/details_harness|winogrande|5_2024-02-01T16-48-44.963342.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T16-48-44.963342.parquet'
- config_name: results
data_files:
- split: 2024_02_01T16_48_44.963342
path:
- results_2024-02-01T16-48-44.963342.parquet
- split: latest
path:
- results_2024-02-01T16-48-44.963342.parquet
---
# Dataset Card for Evaluation run of liminerity/ultra0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
"harness_winogrande_5",
split="train")
```
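The `config_name` entries listed in the metadata above are derived mechanically from the harness task ids that appear in the parquet filenames. As a rough sketch of that mapping (inferred from the listing in this card, not an official Open LLM Leaderboard API):

```python
# Sketch of the naming pattern observed in this card: a harness task id such
# as "harness|hendrycksTest-world_religions|5" (used in the parquet paths)
# maps to the config name "harness_hendrycksTest_world_religions_5".
# This helper is an illustration inferred from the listing above, not an
# official API.
def task_id_to_config_name(task_id: str) -> str:
    # Replace every separator used in task ids with an underscore.
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_id_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

The result can be passed as the second argument to `load_dataset`, as in the example above.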
## Latest results
These are the [latest results from run 2024-02-01T16:48:44.963342](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-02-01T16-48-44.963342.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4801705727969372,
"acc_stderr": 0.034512318476802376,
"acc_norm": 0.48541393567248253,
"acc_norm_stderr": 0.035295245059963676,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.38170797049642685,
"mc2_stderr": 0.013464482874681617
},
"harness|arc:challenge|25": {
"acc": 0.4991467576791809,
"acc_stderr": 0.014611369529813272,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.01458063756999542
},
"harness|hellaswag|10": {
"acc": 0.5878311093407688,
"acc_stderr": 0.004912192800263312,
"acc_norm": 0.7871937860983867,
"acc_norm_stderr": 0.00408455264190366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094528,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094528
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.653211009174312,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.653211009174312,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.03078154910202622,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.03078154910202622
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.043482080516448585,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.043482080516448585
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.03035152732334493,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.03035152732334493
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.01684117465529572,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.01684117465529572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369794,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369794
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325956,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34159061277705344,
"acc_stderr": 0.012112391320842849,
"acc_norm": 0.34159061277705344,
"acc_norm_stderr": 0.012112391320842849
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4526143790849673,
"acc_stderr": 0.020136790918492537,
"acc_norm": 0.4526143790849673,
"acc_norm_stderr": 0.020136790918492537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748017,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748017
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.38170797049642685,
"mc2_stderr": 0.013464482874681617
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552666
},
"harness|gsm8k|5": {
"acc": 0.14329037149355572,
"acc_stderr": 0.00965089572335757
}
}
```
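As a quick sanity check on a results blob like the one above, the per-task accuracies can be re-aggregated by hand. The snippet below is a minimal sketch over a hand-copied excerpt of the JSON (the real file carries one entry per evaluated task, and the leaderboard's own aggregation may differ from a plain unweighted mean):

```python
# Excerpt of the per-task results shown above; the real results file has
# one entry per evaluated task.
results = {
    "harness|arc:challenge|25": {"acc": 0.4991467576791809},
    "harness|hellaswag|10": {"acc": 0.5878311093407688},
    "harness|winogrande|5": {"acc": 0.7403314917127072},
}

# Unweighted (macro) average over whatever tasks report an "acc" value.
accs = [scores["acc"] for scores in results.values() if "acc" in scores]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")
# → macro-average acc over 3 tasks: 0.6091
```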
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FelixdoingAI/IP2P-adwm-128-200 | ---
dataset_info:
features:
- name: original_prompt
dtype: string
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_prompt
dtype: string
- name: edited_image
dtype: image
- name: adversarial_image
dtype: image
splits:
- name: train
num_bytes: 23574196.0
num_examples: 200
download_size: 23573413
dataset_size: 23574196.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anilguven/turkish_spam_email | ---
license: unknown
task_categories:
- text-classification
language:
- tr
tags:
- turkish
- spam
- email
- text-classification
size_categories:
- 1K<n<10K
---
# Dataset Info
<!-- Provide a quick summary of the dataset. -->
Dataset obtained via https://www.kaggle.com/datasets/emrahaydemr/turkish-mail-dataset-normalspam
|
CyberHarem/tiona_hiryute_isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tiona_hiryute (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka)
This is the dataset of tiona_hiryute (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka), containing 34 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
sergeipetrov/transformers-diffusers-docs-raw | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2400232
num_examples: 384
download_size: 1116732
dataset_size: 2400232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jeonseonjin/ezdemo | ---
language:
- en
license: apache-2.0
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: datasets/bodyPerfor_train*
- split: test
path: datasets/bodyPerfor_train*
--- |
yuan-sf63/word_label_0.8_32_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
splits:
- name: train
num_bytes: 21904636.001111876
num_examples: 63945
- name: validation
num_bytes: 2434190.9988881224
num_examples: 7106
download_size: 5755330
dataset_size: 24338827.0
---
# Dataset Card for "word_label_0.8_32_P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
luketheduke/hates_speech_offensive | ---
license: cc-by-4.0
---
|
tyzhu/squad_qa_no_id_v5_full_recite_full_passage_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7807335.809388336
num_examples: 4345
- name: validation
num_bytes: 580390
num_examples: 300
download_size: 1701869
dataset_size: 8387725.809388336
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_full_passage_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_existential_you_have | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 891
num_examples: 4
- name: test
num_bytes: 1272
num_examples: 4
- name: train
num_bytes: 3779
num_examples: 20
download_size: 11631
dataset_size: 5942
---
# Dataset Card for "MULTI_VALUE_wnli_existential_you_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-Base-3B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-Base-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T12:53:13.989201](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1/blob/main/results_2023-10-17T12-53-13.989201.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219258,\n \"f1\": 0.04903838087248335,\n\
\ \"f1_stderr\": 0.0012101494030918615,\n \"acc\": 0.33004332884746224,\n\
\ \"acc_stderr\": 0.008268314684144315\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219258,\n\
\ \"f1\": 0.04903838087248335,\n \"f1_stderr\": 0.0012101494030918615\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.0031069012664996735\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788956\n\
\ }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T12_53_13.989201
path:
- '**/details_harness|drop|3_2023-10-17T12-53-13.989201.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T12-53-13.989201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T12_53_13.989201
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-53-13.989201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-53-13.989201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T12_53_13.989201
path:
- '**/details_harness|winogrande|5_2023-10-17T12-53-13.989201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T12-53-13.989201.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- results_2023-07-19T15:11:56.441864.parquet
- split: 2023_08_28T20_41_49.693075
path:
- results_2023-08-28T20:41:49.693075.parquet
- split: 2023_10_17T12_53_13.989201
path:
- results_2023-10-17T12-53-13.989201.parquet
- split: latest
path:
- results_2023-10-17T12-53-13.989201.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-Base-3B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-Base-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T12:53:13.989201](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1/blob/main/results_2023-10-17T12-53-13.989201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219258,
"f1": 0.04903838087248335,
"f1_stderr": 0.0012101494030918615,
"acc": 0.33004332884746224,
"acc_stderr": 0.008268314684144315
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219258,
"f1": 0.04903838087248335,
"f1_stderr": 0.0012101494030918615
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.0031069012664996735
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788956
}
}
```
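Since every run appears as a timestamp-named split (e.g. `2023_10_17T12_53_13.989201`), a small helper can pick the most recent one directly, without relying on the `latest` alias. This is a sketch, not part of the official leaderboard tooling; it only assumes the split-name pattern used in this repository:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names follow the pattern used in this repo,
    e.g. '2023_10_17T12_53_13.989201'; the 'latest' alias is skipped.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    # keep only timestamp-shaped names, then take the newest
    return max((s for s in split_names if s != "latest"), key=parse)

print(latest_split(["2023_07_19T15_11_56.441864",
                    "2023_10_17T12_53_13.989201",
                    "latest"]))
```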
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/raiden_shogun_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of raiden_shogun/雷電将軍/雷电将军 (Genshin Impact)
This is the dataset of raiden_shogun/雷電将軍/雷电将军 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_hair, purple_eyes, breasts, mole, mole_under_eye, large_breasts, hair_ornament, braid, very_long_hair, braided_ponytail, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.35 GiB | [Download](https://huggingface.co/datasets/CyberHarem/raiden_shogun_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/raiden_shogun_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1392 | 2.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/raiden_shogun_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/raiden_shogun_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
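The IMG+TXT packages listed above pair each image with a same-named `.txt` file containing its tags. Below is a minimal sketch for walking such a directory after extraction; the file layout is an assumption based on the package descriptions, not part of an official loader:

```python
import os

def iter_pairs(dataset_dir):
    """Yield (image_filename, tags) for each image/.txt pair in dataset_dir."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue  # skip the .txt tag files themselves
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                yield name, f.read().strip()
```

For the raw package, prefer the waifuc loader above, which also exposes per-image metadata.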
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, cleavage, solo, thighs, looking_at_viewer, blush, navel, stomach, black_bra, collarbone, black_panties, underwear_only, bare_shoulders, closed_mouth, black_thighhighs, flower, lace-trimmed_bra, purple_bra, purple_panties, bare_arms, blunt_bangs, parted_lips |
| 1 | 8 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, solo, cleavage, parted_lips, blush, long_sleeves, off_shoulder, thighs, white_shirt, black_bra, collarbone, black_choker, black_panties, black_pantyhose, earrings, flower, navel, panties_under_pantyhose, bare_shoulders, black_thighhighs, jacket, open_shirt, stomach, thighband_pantyhose |
| 2 | 8 |  |  |  |  |  | 1girl, alternate_costume, collared_shirt, contemporary, office_lady, pencil_skirt, solo, white_shirt, black_pantyhose, black_skirt, blush, looking_at_viewer, thighs, id_card, lanyard, nail_polish, purple_nails, cleavage, sitting, thighband_pantyhose, open_mouth, blurry, choker, holding, indoors, long_sleeves, paper, purple_flower, underwear |
| 3 | 22 |  |  |  |  |  | 1girl, cleavage, musou_isshin_(genshin_impact), solo, wide_sleeves, bridal_gauntlets, mitsudomoe_(shape), purple_kimono, electricity, looking_at_viewer, obijime, human_scabbard, long_sleeves, obiage, shoulder_armor, purple_nails, tassel, closed_mouth, drawing_sword, purple_flower, purple_thighhighs, ribbon, thighs, black_thighhighs, parted_lips |
| 4 | 6 |  |  |  |  |  | 1girl, blush, bridal_gauntlets, cleavage, long_sleeves, looking_at_viewer, obi, off_shoulder, purple_kimono, shoulder_armor, solo, thighs, wide_sleeves, cowboy_shot, flower, purple_thighhighs, blurry_background, closed_mouth, shrug_(clothing), simple_background, white_background |
| 5 | 10 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, solo, looking_at_viewer, china_dress, closed_mouth, covered_navel, sleeveless_dress, thighs, blush, cowboy_shot, purple_flower, hand_up, purple_dress, single_braid, black_dress, cleavage_cutout, elbow_gloves, pelvic_curtain, simple_background, white_background, black_thighhighs, earrings, purple_nails, side_slit, tassel |
| 6 | 19 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, bare_shoulders, navel, stomach, thighs, alternate_costume, halterneck, water, outdoors, parted_lips, wet, blush, collarbone, black_bikini, bare_arms, side-tie_bikini_bottom, sitting, string_bikini, beach, single_braid, blue_sky, day, ocean, purple_flower, smile |
| 7 | 9 |  |  |  |  |  | 1girl, backless_dress, bare_shoulders, solo, alternate_costume, ass, bare_back, from_behind, looking_at_viewer, black_dress, earrings, looking_back, median_furrow, sideboob, thighs, bare_arms, closed_mouth, blush, purple_flower, sleeveless_dress, standing, back_tattoo, butt_crack, thigh_strap |
| 8 | 10 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, playboy_bunny, rabbit_ears, solo, bare_shoulders, black_leotard, cleavage, detached_collar, fake_animal_ears, strapless_leotard, blush, wrist_cuffs, highleg_leotard, thighs, black_pantyhose, brown_pantyhose, closed_mouth, covered_navel, simple_background, white_background, black_bowtie, black_hairband, collarbone, cowboy_shot, parted_lips, rabbit_tail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | thighs | looking_at_viewer | blush | navel | stomach | black_bra | collarbone | black_panties | underwear_only | bare_shoulders | closed_mouth | black_thighhighs | flower | lace-trimmed_bra | purple_bra | purple_panties | bare_arms | blunt_bangs | parted_lips | alternate_costume | long_sleeves | off_shoulder | white_shirt | black_choker | black_pantyhose | earrings | panties_under_pantyhose | jacket | open_shirt | thighband_pantyhose | collared_shirt | contemporary | office_lady | pencil_skirt | black_skirt | id_card | lanyard | nail_polish | purple_nails | sitting | open_mouth | blurry | choker | holding | indoors | paper | purple_flower | underwear | musou_isshin_(genshin_impact) | wide_sleeves | bridal_gauntlets | mitsudomoe_(shape) | purple_kimono | electricity | obijime | human_scabbard | obiage | shoulder_armor | tassel | drawing_sword | purple_thighhighs | ribbon | obi | cowboy_shot | blurry_background | shrug_(clothing) | simple_background | white_background | china_dress | covered_navel | sleeveless_dress | hand_up | purple_dress | single_braid | black_dress | cleavage_cutout | elbow_gloves | pelvic_curtain | side_slit | halterneck | water | outdoors | wet | black_bikini | side-tie_bikini_bottom | string_bikini | beach | blue_sky | day | ocean | smile | backless_dress | ass | bare_back | from_behind | looking_back | median_furrow | sideboob | standing | back_tattoo | butt_crack | thigh_strap | playboy_bunny | rabbit_ears | black_leotard | detached_collar | fake_animal_ears | strapless_leotard | wrist_cuffs | highleg_leotard | brown_pantyhose | black_bowtie | black_hairband | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:---------|:--------------------|:--------|:--------|:----------|:------------|:-------------|:----------------|:-----------------|:-----------------|:---------------|:-------------------|:---------|:-------------------|:-------------|:-----------------|:------------|:--------------|:--------------|:--------------------|:---------------|:---------------|:--------------|:---------------|:------------------|:-----------|:--------------------------|:---------|:-------------|:----------------------|:-----------------|:---------------|:--------------|:---------------|:--------------|:----------|:----------|:--------------|:---------------|:----------|:-------------|:---------|:---------|:----------|:----------|:--------|:----------------|:------------|:--------------------------------|:---------------|:-------------------|:---------------------|:----------------|:--------------|:----------|:-----------------|:---------|:-----------------|:---------|:----------------|:--------------------|:---------|:------|:--------------|:--------------------|:-------------------|:--------------------|:-------------------|:--------------|:----------------|:-------------------|:----------|:---------------|:---------------|:--------------|:------------------|:---------------|:-----------------|:------------|:-------------|:--------|:-----------|:------|:---------------|:-------------------------|:----------------|:--------|:-----------|:------|:--------|:--------|:-----------------|:------|:------------|:--------------|:---------------|:----------------|:-----------|:-----------|:--------------|:-------------|:--------------|:----------------|:--------------|:----------------|:------------------|:-------------------|:--------------------|:--------------|:------------------|:------------------|:---------------|:-----------------|:--------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | X | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 22 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | X | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | X | X | X | X | | | | | | | X | X | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | X | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | X | X | X | X | | | | | | | X | X | | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | X | X | | | | X | | | X | X | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
alshahri/xauusd-h1-bid-2019-01-01-2023-05-30 | ---
license: other
---
|
CyberHarem/ninian_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ninian (Fire Emblem)
This is the dataset of ninian (Fire Emblem), containing 388 images and their tags.
The core tags of this character are `long_hair, blue_hair, red_eyes, hair_ornament, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 388 | 458.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 388 | 280.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 811 | 533.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 388 | 414.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 811 | 717.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ninian_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, simple_background, smile, solo, cape, full_body, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, bare_shoulders, solo, looking_at_viewer, smile, cape, blue_dress, blush |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, solo, white_background, simple_background, smile |
| 3 | 43 |  |  |  |  |  | 1girl, bare_shoulders, bride, wedding_dress, smile, solo, white_dress, bridal_veil, flower, looking_at_viewer, bouquet, gloves, strapless_dress |
| 4 | 8 |  |  |  |  |  | 1girl, bangs, bare_shoulders, full_body, long_dress, solo, floating_object, medium_breasts, open_mouth, shiny_hair, gradient_clothes, stone, turtleneck, white_background, aqua_hair, blue_dress, looking_at_viewer, snowflakes, cape, dark_aura, glowing_eyes, simple_background, transparent_background |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, blush, penis, sex, solo_focus, nipples, open_mouth, vaginal, completely_nude, spread_legs, cum_in_pussy, large_breasts, lying, missionary, mosaic_censoring |
| 6 | 15 |  |  |  |  |  | hetero, multiple_boys, multiple_penises, 1girl, nipples, solo_focus, vaginal, large_breasts, gangbang, cum_in_pussy, mosaic_censoring, double_handjob, torn_clothes, blush, bukkake, dress, facial, fellatio, nude, rape, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | dress | looking_at_viewer | simple_background | smile | solo | cape | full_body | white_background | blue_dress | blush | bride | wedding_dress | white_dress | bridal_veil | flower | bouquet | gloves | strapless_dress | bangs | long_dress | floating_object | medium_breasts | open_mouth | shiny_hair | gradient_clothes | stone | turtleneck | aqua_hair | snowflakes | dark_aura | glowing_eyes | transparent_background | 1boy | hetero | penis | sex | solo_focus | nipples | vaginal | completely_nude | spread_legs | cum_in_pussy | large_breasts | lying | missionary | mosaic_censoring | multiple_boys | multiple_penises | gangbang | double_handjob | torn_clothes | bukkake | facial | fellatio | nude | rape | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------------|:--------------------|:--------|:-------|:-------|:------------|:-------------------|:-------------|:--------|:--------|:----------------|:--------------|:--------------|:---------|:----------|:---------|:------------------|:--------|:-------------|:------------------|:-----------------|:-------------|:-------------|:-------------------|:--------|:-------------|:------------|:-------------|:------------|:---------------|:-------------------------|:-------|:---------|:--------|:------|:-------------|:----------|:----------|:------------------|:--------------|:---------------|:----------------|:--------|:-------------|:-------------------|:----------------|:-------------------|:-----------|:-----------------|:---------------|:----------|:---------|:-----------|:-------|:-------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 43 |  |  |  |  |  | X | X | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
AlcNdr/AlcVoice | ---
license: unknown
---
|
CyberHarem/yelan_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yelan/夜蘭/夜兰 (Genshin Impact)
This is the dataset of yelan/夜蘭/夜兰 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `blue_hair, breasts, short_hair, green_eyes, large_breasts, diagonal_bangs, bob_cut, multicolored_hair, mole, black_hair, earrings, mole_on_breast, tassel`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.34 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yelan_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yelan_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1388 | 2.11 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yelan_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yelan_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, armpits, cleavage, dice, elbow_gloves, fur_trim, jacket, jewelry, looking_at_viewer, sleeveless, smile, solo, upper_body, bare_shoulders, blunt_bangs, parted_lips, black_gloves |
| 1 | 7 |  |  |  |  |  | 1girl, armpits, arms_up, bare_shoulders, black_gloves, blue_dress, cleavage, dice, fur-trimmed_jacket, looking_at_viewer, sleeveless, smile, solo, white_jacket, jacket_on_shoulders, mismatched_gloves, vision_(genshin_impact), clothing_cutout, parted_lips, bracelet, pelvic_curtain, single_elbow_glove, cowboy_shot, white_gloves |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_dress, bracelet, cleavage, dice, elbow_gloves, fingerless_gloves, fur-trimmed_jacket, jacket_on_shoulders, looking_at_viewer, mismatched_gloves, parted_lips, smile, solo, white_gloves, white_jacket, pelvic_curtain, sleeveless, vision_(genshin_impact), armpits, arm_up, simple_background, white_background |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, fake_animal_ears, jewelry, looking_at_viewer, playboy_bunny, rabbit_ears, solo, thighs, armpits, black_pantyhose, blue_leotard, fur-trimmed_jacket, highleg_leotard, sitting, smile, arms_up, parted_lips, white_jacket, arms_behind_head, black_gloves, brown_pantyhose, choker, dice, high_heels, single_elbow_glove, strapless_leotard |
| 4 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, thighs, cleavage, navel, stomach, blue_sky, outdoors, bare_shoulders, day, water, wet, cloud, cowboy_shot, smile, armpits, arms_up, blue_bikini, parted_lips, choker, dice, jewelry, ocean, blush, collarbone |
| 5 | 7 |  |  |  |  |  | 1girl, cleavage, collarbone, jewelry, looking_at_viewer, smile, solo, thighs, underwear_only, bare_shoulders, panties, armpits, arms_up, black_bra, braid, dice, navel, stomach, black_gloves, black_thighhighs, blush, choker, elbow_gloves, parted_lips |
| 6 | 6 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, looking_at_viewer, midriff, navel, smile, solo, stomach, cleavage, collarbone, armpits, blush, crop_top, parted_lips, short_shorts, sports_bra, thighs, arm_up, cowboy_shot, patreon_username |
| 7 | 5 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, sleeveless_shirt, solo, armpits, arms_up, bare_shoulders, blush, bracelet, contemporary, office_lady, parted_lips, pencil_skirt, arms_behind_head, black_pantyhose, black_skirt, blunt_bangs, cleavage, collared_shirt, gradient_hair, indoors, sitting, smile, two-tone_hair, white_shirt, brown_pantyhose, cowboy_shot, dice, high-waist_skirt, id_card, lanyard, office_chair, one_eye_closed, shirt_tucked_in, swivel_chair, window_blinds |
| 8 | 6 |  |  |  |  |  | 1girl, belt, crop_top, looking_at_viewer, midriff, navel, smile, solo, alternate_costume, black_pants, long_sleeves, open_clothes, white_shirt, chain, holding, jewelry, sleeveless_shirt, stomach, armpits, bag, bare_shoulders, beret, black_headwear, coat, parted_lips, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armpits | cleavage | dice | elbow_gloves | fur_trim | jacket | jewelry | looking_at_viewer | sleeveless | smile | solo | upper_body | bare_shoulders | blunt_bangs | parted_lips | black_gloves | arms_up | blue_dress | fur-trimmed_jacket | white_jacket | jacket_on_shoulders | mismatched_gloves | vision_(genshin_impact) | clothing_cutout | bracelet | pelvic_curtain | single_elbow_glove | cowboy_shot | white_gloves | fingerless_gloves | arm_up | simple_background | white_background | fake_animal_ears | playboy_bunny | rabbit_ears | thighs | black_pantyhose | blue_leotard | highleg_leotard | sitting | arms_behind_head | brown_pantyhose | choker | high_heels | strapless_leotard | navel | stomach | blue_sky | outdoors | day | water | wet | cloud | blue_bikini | ocean | blush | collarbone | underwear_only | panties | black_bra | braid | black_thighhighs | alternate_costume | midriff | crop_top | short_shorts | sports_bra | patreon_username | sleeveless_shirt | contemporary | office_lady | pencil_skirt | black_skirt | collared_shirt | gradient_hair | indoors | two-tone_hair | white_shirt | high-waist_skirt | id_card | lanyard | office_chair | one_eye_closed | shirt_tucked_in | swivel_chair | window_blinds | belt | black_pants | long_sleeves | open_clothes | chain | holding | bag | beret | black_headwear | coat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-----------|:-------|:---------------|:-----------|:---------|:----------|:--------------------|:-------------|:--------|:-------|:-------------|:-----------------|:--------------|:--------------|:---------------|:----------|:-------------|:---------------------|:---------------|:----------------------|:--------------------|:--------------------------|:------------------|:-----------|:-----------------|:---------------------|:--------------|:---------------|:--------------------|:---------|:--------------------|:-------------------|:-------------------|:----------------|:--------------|:---------|:------------------|:---------------|:------------------|:----------|:-------------------|:------------------|:---------|:-------------|:--------------------|:--------|:----------|:-----------|:-----------|:------|:--------|:------|:--------|:--------------|:--------|:--------|:-------------|:-----------------|:----------|:------------|:--------|:-------------------|:--------------------|:----------|:-----------|:---------------|:-------------|:-------------------|:-------------------|:---------------|:--------------|:---------------|:--------------|:-----------------|:----------------|:----------|:----------------|:--------------|:-------------------|:----------|:----------|:---------------|:-----------------|:------------------|:---------------|:----------------|:-------|:--------------|:---------------|:---------------|:--------|:----------|:------|:--------|:-----------------|:-------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | | | X | X | X | X | | X | | X | X | | X | X | X | X | X | X | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | | | X | X | | X | X | | X | | X | X | X | | X | X | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | X | | | | X | X | | X | X | | X | | X | | X | | | | | | | | | | | X | | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | X | | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | | | | | | X | | X | X | | X | | X | | | | | | | | | | | | | X | | | X | | | | | | X | | | | | | | | | | X | X | | | | | | | | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | | | | | X | | X | X | | X | X | X | | X | | | | | | | | X | | | X | | | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | X | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | | | | | X | X | | X | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | X | X | | | | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
bot-yaya/EN_PARAGRAPH_GPT_JOINED | ---
dataset_info:
features:
- name: record
dtype: string
- name: raw_text
dtype: string
- name: is_hard_linebreak
sequence: bool
splits:
- name: train
num_bytes: 6311655
num_examples: 196
download_size: 3088802
dataset_size: 6311655
---
# Dataset Card for "EN_PARAGRAPH_GPT_JOINED"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
songlab/deprecated-human_variants | ---
license: mit
tags:
- dna
- variant-effect-prediction
- biology
- genomics
---
# Human variants
A curated set of variants from four sources: ClinVar, COSMIC, OMIM, and gnomAD.
Predictions for methods benchmarked in GPN-MSA paper can be [downloaded from here](https://huggingface.co/datasets/songlab/human_variants/resolve/main/variants_and_preds.parquet).
Functional annotations can be [downloaded from here](https://huggingface.co/datasets/songlab/human_variants/resolve/main/functional_annotations.zip).
For more information check out our [paper](https://doi.org/10.1101/2023.10.10.561776) and [repository](https://github.com/songlab-cal/gpn).
## Data sources
**ClinVar**:
Missense variants considered "Pathogenic" by human labelers.
**COSMIC**:
Somatic missense variants with a frequency at least 0.1% in cancer samples (whole-genome and whole-exome sequencing only).
**OMIM**:
Regulatory variants considered "Pathogenic" by human labelers, curated in [this paper](https://doi.org/10.1016/j.ajhg.2016.07.005).
**gnomAD**:
All common variants (MAF > 5%) as well as an equally-sized subset of rare variants (MAC=1). Only autosomes are included.
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("songlab/human_variants", split="test")
```
Subset - ClinVar Pathogenic vs. gnomAD common (missense) (can specify `num_proc` to speed up):
```python
dataset = dataset.filter(lambda v: v["source"]=="ClinVar" or (v["label"]=="Common" and "missense" in v["consequence"]))
```
Subset - COSMIC frequent vs. gnomAD common (missense):
```python
dataset = dataset.filter(lambda v: v["source"]=="COSMIC" or (v["label"]=="Common" and "missense" in v["consequence"]))
```
Subset - OMIM Pathogenic vs. gnomAD common (regulatory):
```python
cs = ["5_prime_UTR", "upstream_gene", "intergenic", "3_prime_UTR", "non_coding_transcript_exon"]
dataset = dataset.filter(lambda v: v["source"]=="OMIM" or (v["label"]=="Common" and "missense" not in v["consequence"] and any([c in v["consequence"] for c in cs])))
```
Subset - gnomAD rare vs. gnomAD common:
```python
dataset = dataset.filter(lambda v: v["source"]=="gnomAD")
``` |
saurabh1896/OMR-forms | ---
dataset_info:
features:
- name: image
dtype: image
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 8632972.0
num_examples: 14
- name: test
num_bytes: 1629831.0
num_examples: 4
download_size: 7181972
dataset_size: 10262803.0
---
# Dataset Card for "OMR-forms"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_98 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1256867856.0
num_examples: 244908
download_size: 1287320763
dataset_size: 1256867856.0
---
# Dataset Card for "chunk_98"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Szym-on/testowy | ---
license: mit
configs:
- config_name: default
data_files:
- split: test
path: "test_data.csv"
- split: train
path: "test_data.csv"
extra_gated_prompt: >-
You agree to not use the dataset to conduct experiments that cause harm to
human subjects.
extra_gated_heading: Condition
extra_gated_button_content: Agree and continue
split_sizes:
train: 70%
validation: 15%
test: 15%
extra_gated_fields:
Company: text
Country: country
Specific date: date_picker
I want to use this dataset for:
type: select
options:
- Research
- Education
- label: Other
value: other
I agree to use this dataset for non-commercial use ONLY: checkbox
pretty_name: Teścik
size_categories:
- 10K<n<100K
--- |
Rasu23/iapp_wiki_qa_squad_cleaned | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: article_id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
- name: answers_text
dtype: string
splits:
- name: train
num_bytes: 16985229
num_examples: 5745
- name: validation
num_bytes: 2240237
num_examples: 740
- name: test
num_bytes: 2153736
num_examples: 739
download_size: 3115286
dataset_size: 21379202
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
snorkelai/snorkel-curated-instruction-tuning | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
***<p style="font-size: 20px">Please check out our Blog Post - [How we built a better GenAI with programmatic data development](https://snorkel.ai/how-we-built-better-genai-with-programmatic-data-development/) for more details!</p>***
## Summary
`snorkel-curated-instruction-tuning` is a curated dataset that consists of high-quality instruction-response pairs.
These pairs were programmatically filtered with weak supervision from open-source datasets [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k),
[Open Assistant](https://huggingface.co/datasets/OpenAssistant/oasst1),
and [Helpful Instructions](https://huggingface.co/datasets/HuggingFaceH4/helpful_instructions).
To enhance the dataset, we also programmatically classified each instruction based on the InstructGPT paper.
For a more comprehensive understanding of our methodology, please visit our [blog](https://snorkel.ai/how-we-built-better-genai-with-programmatic-data-development/).
## Dataset Overview & Methodology
Instruction tuning is an important step in developing effective [large language models (LLMs)](https://snorkel.ai/large-language-models-llms/) for generative AI tasks.
While proprietary datasets have been used by LLM-backed chatbots, the open-source community has created similar datasets accessible to everyone.
However, the quality of responses collected by volunteers has been inconsistent, affecting the quality of open-source models. Furthermore, there is currently no standard classification of instructions across datasets (many lack classification altogether), which can complicate measurements of instruction diversity when compiling from multiple sources.
Snorkel, with its expertise in converting noisy signals into high-quality supervision, addressed this issue by programmatically scoring, sampling, and filtering open-source datasets.
The curated dataset and methodology are now available for public use.
Please refer to our [blog](https://snorkel.ai/how-we-built-better-genai-with-programmatic-data-development/) for more details on methods and evaluation.
## File descriptions
- `snorkel_curated_11k.jsonl`: 11k high-quality instruction-response pairs selected from the open-source datasets mentioned above. This file is then used to instruction-tune the [snorkelai/RedPajama-7B-Chat-Curated](https://huggingface.co/snorkelai/RedPajama-7B-Chat-Curated/).
- `snorkel_hold_out_set.jsonl`: A hold-out set for evaluation, comparing human preferences between models.
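Both files use the JSON Lines layout (one JSON object per line). A minimal sketch for loading such a file — the record field names (`instruction`, `response`) are assumptions for illustration, not confirmed by this card:

```python
import json

def read_jsonl(path):
    """Read one JSON object per line, skipping blank lines."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

# Demonstrate with a small stand-in file; the real file would be
# snorkel_curated_11k.jsonl downloaded from this repository.
with open("demo.jsonl", "w", encoding="utf-8") as f:
    f.write('{"instruction": "Say hi", "response": "Hi!"}\n')

records = read_jsonl("demo.jsonl")
print(len(records))  # 1
```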
## Intended Uses
- Instruction-tuning LLMs
For more detailed information, please refer to our blog post available at [How we built a better GenAI with programmatic data development](https://snorkel.ai/how-we-built-better-genai-with-programmatic-data-development/).
## License/Attribution
**Copyright (2023) Snorkel AI, Inc.** This dataset was developed at [Snorkel AI](https://snorkel.ai/) and its use is subject to the Apache 2.0 license.
This work comes with the collaboration with Together Computer in releasing the [snorkelai/RedPajama-7B-Chat-Curated](https://huggingface.co/snorkelai/RedPajama-7B-Chat-Curated/) model.
Please refer to the licenses of the data subsets you use.
- [Open Assistant](https://huggingface.co/datasets/OpenAssistant/oasst1) is under Apache 2.0 license.
- [Helpful Instructions](https://huggingface.co/datasets/HuggingFaceH4/helpful_instructions) is under Apache 2.0 license.
- [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) is under CC BY-SA 3.0 license.
Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:
Wikipedia (various pages) - https://www.wikipedia.org/ Copyright © Wikipedia editors and contributors.
Databricks (https://www.databricks.com) Copyright © Databricks
## Language
English
## Version
Version: 1.0
To cite this dataset, please use:
```
@software{snorkel2023instructiontuning,
author = {Snorkel AI},
title = {Applying programmatic data development to Generative AI with Snorkel},
month     = jun,
year = 2023,
url = {https://huggingface.co/datasets/snorkelai/snorkel-curated-instruction-tuning}
}
```
**Owner: Snorkel AI, Inc.**
## Community
Join us on [Snorkel AI Slack](https://snorkel.ai/slack) |
result-kand2-sdxl-wuerst-karlo/634fb531 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 273
num_examples: 10
download_size: 1461
dataset_size: 273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "634fb531"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DAYSTOSOLVE/MasDoc | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_18 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1439875024.0
num_examples: 282772
download_size: 1466154303
dataset_size: 1439875024.0
---
# Dataset Card for "chunk_18"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tr416/2k_mixed_bc_literal | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2086723
num_examples: 1860
download_size: 1097444
dataset_size: 2086723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "2k_mixed_bc_literal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TonyZero/MR-rt-polarity | ---
license: apache-2.0
---
|
marcelomoreno26/geoguessr | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 3341095987.16
num_examples: 25160
- name: validation
num_bytes: 748548716.14
num_examples: 5372
- name: test
num_bytes: 760439383.18
num_examples: 5445
download_size: 5129381297
dataset_size: 4850084086.48
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
sampath017/plants | ---
license: gpl-3.0
task_categories:
- image-classification
language:
- en
pretty_name: 'plants images '
size_categories:
- n<1K
--- |
ashleybishop/tomi_nil_inference_v2 | ---
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2222149
num_examples: 5994
- name: validation
num_bytes: 2225963
num_examples: 5994
- name: test
num_bytes: 2216602
num_examples: 5994
download_size: 842841
dataset_size: 6664714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Kaludi/food-category-classification-v2.0 | ---
task_categories:
- image-classification
---
# Dataset for project: food-category-classification-v2.0
## Dataset Description
This dataset for project food-category-classification-v2.0 was scraped with the help of a bulk Google image downloader.
## Dataset Structure
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['Bread', 'Dairy', 'Dessert', 'Egg', 'Fried Food', 'Fruit', 'Meat', 'Noodles', 'Rice', 'Seafood', 'Soup', 'Vegetable'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1200 |
| valid | 300 |
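The `target` feature above is a `ClassLabel`, i.e., a fixed mapping between label strings and integer ids. A plain-Python sketch of that mapping, assuming ids follow the listed order:

```python
# Label list copied from the card's "target" ClassLabel feature
names = ['Bread', 'Dairy', 'Dessert', 'Egg', 'Fried Food', 'Fruit',
         'Meat', 'Noodles', 'Rice', 'Seafood', 'Soup', 'Vegetable']

# ClassLabel-style conversions between label strings and integer ids
str2int = {name: i for i, name in enumerate(names)}
int2str = dict(enumerate(names))

print(str2int["Meat"])  # 6
print(int2str[0])       # Bread
```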
|
mesolitica/snapshot-twitter-2022-09-03 | ---
language:
- ms
---
# Snapshot Twitter
**We are no longer able to snapshot due to API changes**.
## Description
1. minimum timestamp, 2022-04-17T16:30:07.000Z
2. maximum timestamp, 2022-09-03T09:23:52.000Z
3. 7075025 rows
4. full attributes,
```json
{
"datetime": "2022-04-18T05:57:04",
"datetime_gmt8": "2022-04-18T13:57:04",
"data_text": "kekal halal kak https://t.co/YHKqszqPnS",
"body": "kekal halal kak https://t.co/YHKqszqPnS",
"screen_name": "Luke_Sebastian2",
"followers_count": 10413,
"friends_count": 72,
"listed_count": 6,
"favourites_count": 1494,
"statuses_count": 948,
"quoted_status_text": "NULL",
"lang": "in",
"retweet": "false",
"retweet_text": "NULL",
"retweet_text_full": "NULL",
"retweet_count": 0,
"retweet_detail": {},
"quote_count": 0,
"favorite_count": 0,
"reply_count": 0,
"id_str": "1515932406368202753",
"tweet": {
"created_at": "Mon Apr 18 05:57:04 +0000 2022",
"id": 1515932406368202800,
"id_str": "1515932406368202753",
"text": "kekal halal kak😏🤫 https://t.co/YHKqszqPnS",
"display_text_range": [
0,
17
],
"source": "<a href=\"http://twitter.com/download/android\" rel=\"nofollow\">Twitter for Android</a>",
"truncated": false,
"in_reply_to_status_id": null,
"in_reply_to_status_id_str": null,
"in_reply_to_user_id": null,
"in_reply_to_user_id_str": null,
"in_reply_to_screen_name": null,
"user": {
"id": 1431086333024374800,
"id_str": "1431086333024374792",
"name": "☄ʟᴜᴋᴇ",
"screen_name": "Luke_Sebastian2",
"location": "Malaysia",
"url": "http://t.me/Luke_Alqamara",
"description": "|𝟮𝟬🍰|⚤|📚𝗧𝗼𝗽|🇮🇩|📌🇲🇾|Law Student💼|•𝐤𝐞𝐤𝐚𝐬𝐢𝐡𝐤𝐮:@Trevor_Louise1•|Dm me for endorsement/Collab and rates also📩!|•don't forget to smile😊•",
"translator_type": "none",
"protected": false,
"verified": false,
"followers_count": 10413,
"friends_count": 72,
"listed_count": 6,
"favourites_count": 1494,
"statuses_count": 948,
"created_at": "Fri Aug 27 02:49:28 +0000 2021",
"utc_offset": null,
"time_zone": null,
"geo_enabled": true,
"lang": null,
"contributors_enabled": false,
"is_translator": false,
"profile_background_color": "F5F8FA",
"profile_background_image_url": "",
"profile_background_image_url_https": "",
"profile_background_tile": false,
"profile_link_color": "1DA1F2",
"profile_sidebar_border_color": "C0DEED",
"profile_sidebar_fill_color": "DDEEF6",
"profile_text_color": "333333",
"profile_use_background_image": true,
"profile_image_url": "http://pbs.twimg.com/profile_images/1500850780823494658/snCdyeen_normal.jpg",
"profile_image_url_https": "https://pbs.twimg.com/profile_images/1500850780823494658/snCdyeen_normal.jpg",
"profile_banner_url": "https://pbs.twimg.com/profile_banners/1431086333024374792/1647061513",
"default_profile": true,
"default_profile_image": false,
"following": null,
"follow_request_sent": null,
"notifications": null,
"withheld_in_countries": []
},
"geo": null,
"coordinates": null,
"place": {
"id": "7b02fbddf4d9f2c6",
"url": "https://api.twitter.com/1.1/geo/id/7b02fbddf4d9f2c6.json",
"place_type": "city",
"name": "Kuala Lumpur City",
"full_name": "Kuala Lumpur City, Kuala Lumpur Federal Territory",
"country_code": "MY",
"country": "Malaysia",
"bounding_box": {
"type": "Polygon",
"coordinates": [
[
[
101.668232,
3.104906
],
[
101.668232,
3.192155
],
[
101.742378,
3.192155
],
[
101.742378,
3.104906
]
]
]
},
"attributes": {}
},
"contributors": null,
"is_quote_status": false,
"quote_count": 0,
"reply_count": 0,
"retweet_count": 0,
"favorite_count": 0,
"entities": {
"hashtags": [],
"urls": [],
"user_mentions": [],
"symbols": [],
"media": [
{
"id": 1515932334612107300,
"id_str": "1515932334612107268",
"indices": [
18,
41
],
"additional_media_info": {
"monetizable": false
},
"media_url": "http://pbs.twimg.com/ext_tw_video_thumb/1515932334612107268/pu/img/ak2K23DgNDDV-UCC.jpg",
"media_url_https": "https://pbs.twimg.com/ext_tw_video_thumb/1515932334612107268/pu/img/ak2K23DgNDDV-UCC.jpg",
"url": "https://t.co/YHKqszqPnS",
"display_url": "pic.twitter.com/YHKqszqPnS",
"expanded_url": "https://twitter.com/Luke_Sebastian2/status/1515932406368202753/video/1",
"type": "photo",
"sizes": {
"thumb": {
"w": 150,
"h": 150,
"resize": "crop"
},
"medium": {
"w": 540,
"h": 960,
"resize": "fit"
},
"small": {
"w": 383,
"h": 680,
"resize": "fit"
},
"large": {
"w": 540,
"h": 960,
"resize": "fit"
}
}
}
]
},
"extended_entities": {
"media": [
{
"id": 1515932334612107300,
"id_str": "1515932334612107268",
"indices": [
18,
41
],
"additional_media_info": {
"monetizable": false
},
"media_url": "http://pbs.twimg.com/ext_tw_video_thumb/1515932334612107268/pu/img/ak2K23DgNDDV-UCC.jpg",
"media_url_https": "https://pbs.twimg.com/ext_tw_video_thumb/1515932334612107268/pu/img/ak2K23DgNDDV-UCC.jpg",
"url": "https://t.co/YHKqszqPnS",
"display_url": "pic.twitter.com/YHKqszqPnS",
"expanded_url": "https://twitter.com/Luke_Sebastian2/status/1515932406368202753/video/1",
"type": "video",
"video_info": {
"aspect_ratio": [
9,
16
],
"duration_millis": 15232,
"variants": [
{
"bitrate": 632000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1515932334612107268/pu/vid/320x568/3gN3Udy0BrbU8HFr.mp4?tag=12"
},
{
"content_type": "application/x-mpegURL",
"url": "https://video.twimg.com/ext_tw_video/1515932334612107268/pu/pl/V6UZr3a49tZHwoia.m3u8?tag=12&container=fmp4"
},
{
"bitrate": 950000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1515932334612107268/pu/vid/480x852/CpA6Jht3IZjzh75X.mp4?tag=12"
},
{
"bitrate": 2176000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1515932334612107268/pu/vid/540x960/EdWN9mo8jIbA5PDM.mp4?tag=12"
}
]
},
"sizes": {
"thumb": {
"w": 150,
"h": 150,
"resize": "crop"
},
"medium": {
"w": 540,
"h": 960,
"resize": "fit"
},
"small": {
"w": 383,
"h": 680,
"resize": "fit"
},
"large": {
"w": 540,
"h": 960,
"resize": "fit"
}
}
}
]
},
"favorited": false,
"retweeted": false,
"possibly_sensitive": false,
"filter_level": "low",
"lang": "in",
"timestamp_ms": "1650261424997",
"ignore_lang": true
},
"type": "search"
}
```
5. stream filtered by geo boundary,
```python
stream.filter(
locations=[
99.8568959909,
0.8232449017,
119.5213933664,
7.2037547089,
]
)
``` |
naorm/dnrti-cyner-ner | ---
dataset_info:
features:
- name: Type
dtype: string
- name: Text
dtype: string
- name: Fixed Text
dtype: string
- name: Score
dtype: float64
- name: Original Sentence ID
dtype: int64
- name: Original Sentence
dtype: string
- name: Decoded Sentence
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4719345
num_examples: 11460
download_size: 1303936
dataset_size: 4719345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nadinegp/Pharoh | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_42 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1247203928.0
num_examples: 244934
download_size: 1268075494
dataset_size: 1247203928.0
---
# Dataset Card for "chunk_42"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-mistral-7b-v02 | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.8-mistral-7b-v02
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.8-mistral-7b-v02](https://huggingface.co/cognitivecomputations/dolphin-2.8-mistral-7b-v02)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-mistral-7b-v02\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T23:45:36.461249](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-mistral-7b-v02/blob/main/results_2024-04-07T23-45-36.461249.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6251524395498075,\n\
\ \"acc_stderr\": 0.03244455889660083,\n \"acc_norm\": 0.6294934152120081,\n\
\ \"acc_norm_stderr\": 0.0330967072320074,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.49576535581378267,\n\
\ \"mc2_stderr\": 0.014917885575473846\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.01413117676013117\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6416052579167496,\n\
\ \"acc_stderr\": 0.004785488626807578,\n \"acc_norm\": 0.8379804819757021,\n\
\ \"acc_norm_stderr\": 0.00367715668784884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642514,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642514\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091098,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091098\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159462,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159462\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.01648278218750067,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.01648278218750067\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.01272844606766998,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.01272844606766998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.0287951855742913,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.0287951855742913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.49576535581378267,\n\
\ \"mc2_stderr\": 0.014917885575473846\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235805\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43745261561789234,\n \
\ \"acc_stderr\": 0.013664299060751915\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/ultra0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|arc:challenge|25_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|gsm8k|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hellaswag|10_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T23-45-36.461249.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T23-45-36.461249.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- '**/details_harness|winogrande|5_2024-04-07T23-45-36.461249.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T23-45-36.461249.parquet'
- config_name: results
data_files:
- split: 2024_04_07T23_45_36.461249
path:
- results_2024-04-07T23-45-36.461249.parquet
- split: latest
path:
- results_2024-04-07T23-45-36.461249.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.8-mistral-7b-v02
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.8-mistral-7b-v02](https://huggingface.co/cognitivecomputations/dolphin-2.8-mistral-7b-v02) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-mistral-7b-v02",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-07T23:45:36.461249](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-mistral-7b-v02/blob/main/results_2024-04-07T23-45-36.461249.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6251524395498075,
"acc_stderr": 0.03244455889660083,
"acc_norm": 0.6294934152120081,
"acc_norm_stderr": 0.0330967072320074,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.49576535581378267,
"mc2_stderr": 0.014917885575473846
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.01413117676013117
},
"harness|hellaswag|10": {
"acc": 0.6416052579167496,
"acc_stderr": 0.004785488626807578,
"acc_norm": 0.8379804819757021,
"acc_norm_stderr": 0.00367715668784884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091098,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091098
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.01648278218750067,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.01648278218750067
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.01272844606766998,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.01272844606766998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.0287951855742913,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.0287951855742913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.49576535581378267,
"mc2_stderr": 0.014917885575473846
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235805
},
"harness|gsm8k|5": {
"acc": 0.43745261561789234,
"acc_stderr": 0.013664299060751915
}
}
```
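Since the per-task metrics above are plain JSON, they can be ranked or filtered locally without re-running any evaluation. A minimal sketch using a handful of accuracies copied from the results above (the task selection is illustrative, not exhaustive):

```python
# A few per-task accuracies copied from the JSON results above (illustrative subset).
scores = {
    "hendrycksTest-us_foreign_policy": 0.88,
    "hendrycksTest-marketing": 0.8675213675213675,
    "hendrycksTest-world_religions": 0.8187134502923976,
    "hendrycksTest-college_mathematics": 0.3,
    "hendrycksTest-high_school_physics": 0.2980132450331126,
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full results file once downloaded.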
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
trungdang2901/dattta | ---
license: apache-2.0
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1711982328
num_examples: 256746
- name: validation
num_bytes: 187084076
num_examples: 28057
download_size: 454650047
dataset_size: 1899066404
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/elysia_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elysia (Houkai 3rd)
This is the dataset of elysia (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are `pink_hair, bangs, long_hair, pointy_ears, breasts, hair_ornament, blue_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elysia_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 481.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elysia_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1316 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elysia_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 870.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elysia_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1316 | 1.67 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elysia_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elysia_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, cleavage, elf, frills, looking_at_viewer, maid_headdress, smile, solo, white_gloves, official_alternate_costume, short_sleeves, enmaided, white_background, white_thighhighs, elbow_gloves, one_eye_closed, simple_background, heart_hands, maid_apron, ponytail |
| 1 | 7 |  |  |  |  |  | 1girl, closed_mouth, elf, simple_background, solo, white_background, looking_at_viewer, smile, cleavage, hair_between_eyes |
| 2 | 29 |  |  |  |  |  | 1girl, bare_shoulders, smile, solo, white_dress, white_gloves, looking_at_viewer, closed_mouth, pink_eyes, purple_eyes, elf, cleavage |
| 3 | 9 |  |  |  |  |  | 1girl, boots, looking_at_viewer, smile, solo, white_dress, white_footwear, white_gloves, bare_shoulders, full_body, pink_eyes, closed_mouth, shorts, very_long_hair, purple_eyes, holding, cleavage, staff |
| 4 | 25 |  |  |  |  |  | 1girl, elf, long_sleeves, smile, solo, looking_at_viewer, cleavage, thigh_boots, thighhighs, single_glove, black_gloves, black_shorts, closed_mouth, ponytail, asymmetrical_sleeves, simple_background, white_background |
| 5 | 26 |  |  |  |  |  | 1girl, solo, elf, looking_at_viewer, smile, white_bikini, cleavage, navel, frilled_bikini, outdoors, bikini_skirt, necklace, blue_sky, open_mouth, water |
| 6 | 5 |  |  |  |  |  | 1girl, collarbone, navel, nipples, smile, solo, completely_nude, looking_at_viewer, purple_eyes, blush, closed_mouth, pussy, very_long_hair, cleft_of_venus, cowboy_shot, elf, mosaic_censoring, one_eye_closed, pink_eyes, sitting, stomach |
| 7 | 12 |  |  |  |  |  | 1boy, 1girl, hetero, looking_at_viewer, nipples, solo_focus, blush, elf, pussy, navel, penis, sex, smile, spread_legs, vaginal, open_mouth, pov, completely_nude, cowgirl_position, girl_on_top, gloves, mosaic_censoring, on_back, sweat, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | elf | frills | looking_at_viewer | maid_headdress | smile | solo | white_gloves | official_alternate_costume | short_sleeves | enmaided | white_background | white_thighhighs | elbow_gloves | one_eye_closed | simple_background | heart_hands | maid_apron | ponytail | closed_mouth | hair_between_eyes | bare_shoulders | white_dress | pink_eyes | purple_eyes | boots | white_footwear | full_body | shorts | very_long_hair | holding | staff | long_sleeves | thigh_boots | thighhighs | single_glove | black_gloves | black_shorts | asymmetrical_sleeves | white_bikini | navel | frilled_bikini | outdoors | bikini_skirt | necklace | blue_sky | open_mouth | water | collarbone | nipples | completely_nude | blush | pussy | cleft_of_venus | cowboy_shot | mosaic_censoring | sitting | stomach | 1boy | hetero | solo_focus | penis | sex | spread_legs | vaginal | pov | cowgirl_position | girl_on_top | gloves | on_back | sweat | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:------|:---------|:--------------------|:-----------------|:--------|:-------|:---------------|:-----------------------------|:----------------|:-----------|:-------------------|:-------------------|:---------------|:-----------------|:--------------------|:--------------|:-------------|:-----------|:---------------|:--------------------|:-----------------|:--------------|:------------|:--------------|:--------|:-----------------|:------------|:---------|:-----------------|:----------|:--------|:---------------|:--------------|:-------------|:---------------|:---------------|:---------------|:-----------------------|:---------------|:--------|:-----------------|:-----------|:---------------|:-----------|:-----------|:-------------|:--------|:-------------|:----------|:------------------|:--------|:--------|:-----------------|:--------------|:-------------------|:----------|:----------|:-------|:---------|:-------------|:--------|:------|:--------------|:----------|:------|:-------------------|:--------------|:---------|:----------|:--------|:---------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 29 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | | X | | X | X | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 25 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | X | | | | X | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 26 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | X | | X | X | | | | | | | | X | | | | | X | | | | X | X | | | | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
openai/webgpt_comparisons | ---
pretty_name: WebGPT Comparisons
---
# Dataset Card for WebGPT Comparisons
## Dataset Description
In the [WebGPT paper](https://arxiv.org/abs/2112.09332), the authors trained a reward model from human feedback.
They used the reward model to train a long form question answering model to align with human preferences.
This is the dataset of all comparisons that were marked as suitable for reward modeling by the end of the WebGPT project.
There are 19,578 comparisons in total.
Each example in the dataset contains a pair of model answers for a question, and the associated metadata.
Each answer has a preference score from humans that can be used to determine which of the two answers are better.
Overall, an example has the following fields:
* `question`: The text of the question, together with the name of the dataset from which it was taken and a unique ID.
* `quotes_0`: The extracts that the model found while browsing for `answer_0`, together with the title of the page on which the extract was found, constructed from the HTML title and domain name of the page.
* `answer_0`: The final answer that the model composed using `quotes_0`.
* `tokens_0`: The prefix that would have been given to the model in the final step of the episode to create `answer_0`, and the completion given by the model or human. The prefix is made up of the question and the quotes, with some truncation, and the completion is simply the answer. Both are tokenized using the GPT-2 tokenizer. The concatenation of the prefix and completion is the input used for reward modeling.
* `score_0`: The strength of the preference for `answer_0` over `answer_1` as a number from −1 to 1. It sums to 0 with `score_1`, and an answer is preferred if and only if its score is positive. For reward modeling, we treat scores of 0 as soft 50% labels, and all other scores as hard labels (using only their sign).
* `quotes_1`: The counterpart to `quotes_0`.
* `answer_1`: The counterpart to `answer_0`.
* `tokens_1`: The counterpart to `tokens_0`.
* `score_1`: The counterpart to `score_0`.
This information was found in Appendix K of the WebGPT paper.
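As a concrete reading of the score fields, the following sketch derives a reward-modeling label for `answer_0` from a comparison record; the soft-label treatment of ties follows the description above (the helper name is ours, not part of the dataset):

```python
def label_for_answer_0(score_0: float, score_1: float) -> float:
    """Return the training label for answer_0: 1.0 if preferred,
    0.0 if dispreferred, 0.5 (soft label) for a tie."""
    # The two scores always sum to 0, so checking one of them suffices.
    assert abs(score_0 + score_1) < 1e-9
    if score_0 == 0:
        return 0.5  # ties become soft 50% labels for reward modeling
    return 1.0 if score_0 > 0 else 0.0

print(label_for_answer_0(0.6, -0.6))  # answer_0 preferred
print(label_for_answer_0(0.0, 0.0))   # tie
```

For reward-model training, the `tokens_0`/`tokens_1` fields (prefix plus completion) would serve as the model input paired with these labels.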
## Citation Information
[https://arxiv.org/abs/2112.09332](https://arxiv.org/abs/2112.09332)
```
@inproceedings{nakano2021webgpt,
author = {Reiichiro Nakano and Jacob Hilton and Suchir Balaji and Jeff Wu and Long Ouyang and Christina Kim and Christopher Hesse and Shantanu Jain and Vineet Kosaraju and William Saunders and Xu Jiang and Karl Cobbe and Tyna Eloundou and Gretchen Krueger and Kevin Button and Matthew Knight and Benjamin Chess and John Schulman},
title = {WebGPT: Browser-assisted question-answering with human feedback},
booktitle = {arXiv},
year = 2021,
}
```
Dataset added to the Hugging Face Hub by [@Tristan](https://huggingface.co/Tristan) and [@natolambert](https://huggingface.co/natolambert) |