| datasetId | card |
|---|---|
xzuyn/dalle-3_vs_sd-v1-5_dpo | ---
language:
- en
size_categories:
- n<1K
---
750 [DALL·E 3 images](https://huggingface.co/datasets/dataautogpt3/Dalle3) (the first 3 arrow files), each paired with a base SD v1.5 generation as the rejected image.
Images are base64-encoded byte strings, so the dataset can be stored as JSONL. |
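The base64-in-JSONL storage scheme described above can be sketched as follows. Field names (`prompt`, `chosen`, `rejected`) are illustrative DPO-style names, not the dataset's actual schema:

```python
import base64
import json

# Sketch of the scheme: raw image bytes are base64-encoded so that each
# chosen/rejected pair fits on a single JSONL line. Field names here are
# illustrative, not taken from the actual dataset.

def encode_record(prompt: str, chosen_bytes: bytes, rejected_bytes: bytes) -> str:
    """Serialize one preference pair to a JSONL line."""
    return json.dumps({
        "prompt": prompt,
        "chosen": base64.b64encode(chosen_bytes).decode("ascii"),
        "rejected": base64.b64encode(rejected_bytes).decode("ascii"),
    })

def decode_record(line: str) -> dict:
    """Parse a JSONL line and recover the raw image bytes."""
    rec = json.loads(line)
    rec["chosen"] = base64.b64decode(rec["chosen"])
    rec["rejected"] = base64.b64decode(rec["rejected"])
    return rec

line = encode_record("a red fox", b"\x89PNG-chosen", b"\x89PNG-rejected")
rec = decode_record(line)
```

Base64 inflates storage by roughly 4/3, but it keeps every record valid UTF-8 JSON, which is the trade-off the card alludes to.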
AgoraX/OpenImage-FNCall-50k | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: fn_call
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 16996772690.085
num_examples: 53329
download_size: 16958328059
dataset_size: 16996772690.085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
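As a quick sanity check on the split metadata above, the average stored size per train example can be computed directly from the card's `num_bytes` and `num_examples`:

```python
# Back-of-the-envelope check using the split metadata from the card above.
num_bytes = 16996772690.085
num_examples = 53329

avg_mb = num_bytes / num_examples / 1024**2
# roughly 0.3 MiB per example, consistent with image-sized records
```

At ~17 GB total, loading with `streaming=True` rather than a full download may be preferable, though that is a usage suggestion, not something the card states.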
SachinKaushik/LLAMAV2InstructMaths | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: llamaV2Instruct
dtype: string
splits:
- name: train
num_bytes: 9277981
num_examples: 7473
- name: test
num_bytes: 1664206
num_examples: 1319
download_size: 5455789
dataset_size: 10942187
---
# Dataset Card for "LLAMAV2InstructMaths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
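The card above lists `question`, `answer`, and `llamaV2Instruct` columns but does not say how the third is derived. One plausible construction, shown purely as a hypothetical sketch, is wrapping each pair in the standard Llama 2 instruction template:

```python
# Hedged sketch: one plausible way a "llamaV2Instruct" column could be
# built from "question"/"answer" using the standard Llama 2 [INST] template.
# The card does not state the actual template used.

def to_llama2_instruct(question: str, answer: str) -> str:
    """Format a QA pair in Llama 2 instruction style."""
    return f"<s>[INST] {question} [/INST] {answer} </s>"

example = to_llama2_instruct(
    "What is 7 * 8?",
    "7 * 8 = 56. The answer is 56.",
)
```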
open-llm-leaderboard/details_nlpguy__T3QM7X | ---
pretty_name: Evaluation run of nlpguy/T3QM7X
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/T3QM7X](https://huggingface.co/nlpguy/T3QM7X) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__T3QM7X\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:09:54.363451](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__T3QM7X/blob/main/results_2024-03-24T15-09-54.363451.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511116200650416,\n\
\ \"acc_stderr\": 0.03207313114421989,\n \"acc_norm\": 0.6501218526969689,\n\
\ \"acc_norm_stderr\": 0.03274914773740706,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7801734369560391,\n\
\ \"mc2_stderr\": 0.013690740416496414\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n\
\ \"acc_stderr\": 0.004492535748097627,\n \"acc_norm\": 0.8913563035251942,\n\
\ \"acc_norm_stderr\": 0.003105556631739391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047703,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523367,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7801734369560391,\n\
\ \"mc2_stderr\": 0.013690740416496414\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \
\ \"acc_stderr\": 0.012625423152283035\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/T3QM7X
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-09-54.363451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-09-54.363451.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- '**/details_harness|winogrande|5_2024-03-24T15-09-54.363451.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-09-54.363451.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_09_54.363451
path:
- results_2024-03-24T15-09-54.363451.parquet
- split: latest
path:
- results_2024-03-24T15-09-54.363451.parquet
---
# Dataset Card for Evaluation run of nlpguy/T3QM7X
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/T3QM7X](https://huggingface.co/nlpguy/T3QM7X) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__T3QM7X",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-24T15:09:54.363451](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__T3QM7X/blob/main/results_2024-03-24T15-09-54.363451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6511116200650416,
"acc_stderr": 0.03207313114421989,
"acc_norm": 0.6501218526969689,
"acc_norm_stderr": 0.03274914773740706,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7801734369560391,
"mc2_stderr": 0.013690740416496414
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097627,
"acc_norm": 0.8913563035251942,
"acc_norm_stderr": 0.003105556631739391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047703,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523367,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7801734369560391,
"mc2_stderr": 0.013690740416496414
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627297
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283035
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sanjay920/single_function_call_oai_mistral_large | ---
dataset_info:
features:
- name: id
dtype: string
- name: tools
list:
- name: function
struct:
- name: description
dtype: string
- name: name
dtype: string
- name: parameters
struct:
- name: properties
struct:
- name: amount
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: amount_due
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: author
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: bill_amount
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: birth_date
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: birth_year
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: category
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: country
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: cuisine
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: customer_name
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: date_of_birth
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: destination
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: diet
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: discount_percentage
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: dob
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: due_date
struct:
- name: description
dtype: string
- name: format
dtype: string
- name: type
dtype: string
- name: email
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: encryption_algorithm
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: end_location
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: end_time
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: first_name
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: from_currency
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: genre
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: grades
struct:
- name: items
struct:
- name: properties
struct:
- name: course
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: credit_hours
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: grade
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: required
sequence: string
- name: type
dtype: string
- name: type
dtype: string
- name: height
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: include_numbers
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: include_special_characters
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: include_symbols
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: income
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: ingredients
struct:
- name: description
dtype: string
- name: items
struct:
- name: type
dtype: string
- name: type
dtype: string
- name: interest_rate
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: keyword
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: keywords
struct:
- name: description
dtype: string
- name: items
struct:
- name: type
dtype: string
- name: type
dtype: string
- name: last_name
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: length
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: loan_amount
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: loan_term
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: max
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: measurements
struct:
- name: properties
struct:
- name: length
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: width
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: required
sequence: string
- name: type
dtype: string
- name: message
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: metrics
struct:
- name: description
dtype: string
- name: items
struct:
- name: type
dtype: string
- name: type
dtype: string
- name: min
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: name
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: numbers
struct:
- name: description
dtype: string
- name: items
struct:
- name: type
dtype: string
- name: type
dtype: string
- name: options
struct:
- name: description
dtype: string
- name: items
struct:
- name: type
dtype: string
- name: type
dtype: string
- name: origin
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: original_price
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: principal
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: priority
struct:
- name: description
dtype: string
- name: enum
sequence: string
- name: type
dtype: string
- name: query
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: question
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: recipient
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: search_query
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: shape
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: start_location
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: start_time
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: stock_symbol
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: subject
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: symbol
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: task
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: task_name
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: tax_rate
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: text
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: time
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: tip_percentage
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: title
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: to_currency
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: website_url
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: weight
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: word
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: year
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: required
sequence: string
- name: type
dtype: string
- name: type
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: oai_match_laplateforme
dtype: bool
- name: oai_match_azure
dtype: bool
- name: openai_response
dtype: string
- name: la_plateforme_mistral_large_response
dtype: string
- name: azure_mistral_large_response
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 72421
num_examples: 50
download_size: 140187
dataset_size: 72421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lang-uk/recruitment-dataset-job-descriptions-ukrainian | ---
dataset_info:
features:
- name: Position
dtype: string
- name: Long Description
dtype: string
- name: Company Name
dtype: string
- name: Exp Years
dtype: string
- name: Primary Keyword
dtype: string
- name: English Level
dtype: string
- name: Published
dtype: string
- name: Long Description_lang
dtype: string
- name: id
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 83166918
num_examples: 27461
download_size: 40645342
dataset_size: 83166918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
language:
- uk
size_categories:
- 10K<n<100K
---
# Djinni Dataset (Ukrainian Job Descriptions part)
## Overview
The [Djinni Recruitment Dataset](https://github.com/Stereotypes-in-LLMs/recruitment-dataset) (Ukrainian Job Descriptions part) contains 150,000 job descriptions and 230,000 anonymized candidate CVs, posted between 2020 and 2023 on the [Djinni](https://djinni.co/) IT job platform. The dataset includes samples in English and Ukrainian.
The dataset contains various attributes related to job descriptions, including position titles, job descriptions, company names, experience requirements, keywords, English proficiency levels, publication dates, language of job descriptions, and unique identifiers.
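
As an illustrative sketch of working with the columns listed above, here is a small filter over rows that follow this card's schema. The rows below are made up for illustration (including the `English Level` and `Exp Years` values); real rows come from loading the dataset with `datasets.load_dataset`.

```python
# Hypothetical rows following this card's schema; real rows come from:
#   load_dataset("lang-uk/recruitment-dataset-job-descriptions-ukrainian", split="train")
rows = [
    {"Position": "Senior Python Engineer", "Primary Keyword": "Python",
     "English Level": "upper", "Exp Years": "5y"},
    {"Position": "QA Engineer", "Primary Keyword": "QA",
     "English Level": "pre", "Exp Years": "1y"},
    {"Position": "Python Data Engineer", "Primary Keyword": "Python",
     "English Level": "intermediate", "Exp Years": "3y"},
]

def filter_jobs(rows, keyword=None, english_level=None):
    """Keep rows matching the given Primary Keyword / English Level values."""
    out = []
    for row in rows:
        if keyword is not None and row["Primary Keyword"] != keyword:
            continue
        if english_level is not None and row["English Level"] != english_level:
            continue
        out.append(row)
    return out

python_jobs = filter_jobs(rows, keyword="Python")
# matches the two sample rows whose Primary Keyword is "Python"
```

The same kind of filtering can be done directly on the loaded dataset with its `.filter()` method.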
## Intended Use
The Djinni dataset is designed with versatility in mind, supporting a wide range of applications:
- **Recommender Systems and Semantic Search:** It serves as a key resource for enhancing job recommendation engines and semantic search functionalities, making the job search process more intuitive and tailored to individual preferences.
- **Advancement of Large Language Models (LLMs):** The dataset provides invaluable training data for both English and Ukrainian domain-specific LLMs. It is instrumental in improving the models' understanding and generation capabilities, particularly in specialized recruitment contexts.
- **Fairness in AI-assisted Hiring:** By serving as a benchmark for AI fairness, the Djinni dataset helps mitigate biases in AI-assisted recruitment processes, promoting more equitable hiring practices.
- **Recruitment Automation:** The dataset enables the development of tools for automated creation of resumes and job descriptions, streamlining the recruitment process.
- **Market Analysis:** It offers insights into the dynamics of Ukraine's tech sector, including the impacts of conflicts, aiding in comprehensive market analysis.
- **Trend Analysis and Topic Discovery:** The dataset facilitates modeling and classification for trend analysis and topic discovery within the tech industry.
- **Strategic Planning:** By enabling the automatic identification of company domains, the dataset assists in strategic market planning.
## BibTeX entry and citation info
*When publishing results based on this dataset please refer to:*
```bibtex
@inproceedings{djinni,
title = "Introducing the {D}jinni {R}ecruitment {D}ataset: A Corpus of Anonymized {CV}s and Job Postings",
author = "Drushchak, Nazarii and
Romanyshyn, Mariana",
booktitle = "Proceedings of the Third Ukrainian Natural Language Processing Workshop",
month = may,
year = "2024",
address = "Torino, Italy",
publisher = "European Language Resources Association",
}
```
## Attribution
Special thanks to [Djinni](https://djinni.co/) for providing this invaluable dataset. Their contribution is crucial in advancing research and development in AI, machine learning, and the broader tech industry. Their effort in compiling and sharing this dataset is greatly appreciated by the community. |
Chaymaa/meter_reading | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: test
num_bytes: 2982743.0
num_examples: 2
- name: train
num_bytes: 2992102.0
num_examples: 2
- name: validation
num_bytes: 2968918.0
num_examples: 2
download_size: 8863355
dataset_size: 8943763.0
---
# Dataset Card for "meter_reading"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
agmsb/gem-diabetes | ---
license: mit
language:
- en
--- |
DZN222/22 | ---
license: openrail
---
|
CyberHarem/helena_blavatsky_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of helena_blavatsky/エレナ・ブラヴァツキー/海伦娜·布拉瓦茨基 (Fate/Grand Order)
This is the dataset of helena_blavatsky/エレナ・ブラヴァツキー/海伦娜·布拉瓦茨基 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, short_hair, small_breasts, breasts, hat`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 674.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helena_blavatsky_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 595.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helena_blavatsky_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1289 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/helena_blavatsky_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/helena_blavatsky_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, detached_collar, detached_sleeves, looking_at_viewer, smile, solo, upper_body, white_sleeves, blush, closed_mouth, strapless_dress, flat_chest, simple_background, white_background, armpits, ribbon |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, detached_collar, detached_sleeves, looking_at_viewer, solo, white_sleeves, belt, black_thighhighs, holding_book, smile, strapless_dress, flat_chest, simple_background, white_background, blush, hair_ribbon, closed_mouth |
| 2 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_thighhighs, open_mouth, solo, belt, detached_sleeves, looking_at_viewer, white_sleeves, blush, book, detached_collar, :d, strapless_dress |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_thighhighs, detached_sleeves, looking_at_viewer, solo, white_sleeves, belt, detached_collar, black_dress, hair_ribbon, long_sleeves, smile, strapless_dress, blush, flat_chest, short_dress |
| 4 | 9 |  |  |  |  |  | 1girl, bare_shoulders, black_headwear, black_thighhighs, looking_at_viewer, solo, belt, detached_collar, open_coat, open_mouth, book, smile, beret, blush, long_sleeves, off_shoulder, strapless_dress, black_dress, simple_background, white_background, short_dress |
| 5 | 28 |  |  |  |  |  | 1girl, black_bikini, looking_at_viewer, solo, navel, smile, bare_shoulders, blush, ponytail, black_gloves, black_thighhighs, hair_bow, simple_background, throat_microphone, collarbone, garrison_cap, headphones, flat_chest, black_headwear, closed_mouth, white_background, ribbon |
| 6 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, school_swimsuit, solo, bare_shoulders, blue_one-piece_swimsuit, blush, covered_navel, hair_bow, name_tag, collarbone, simple_background, smile, white_background, ponytail |
| 7 | 5 |  |  |  |  |  | 1girl, beanie, blue_coat, blue_dress, blue_gloves, blue_headwear, blush, fur-trimmed_coat, fur-trimmed_dress, large_bow, long_sleeves, looking_at_viewer, red_bow, smile, solo, ankh, blue_footwear, boots, sack, badge, brown_pantyhose, open_mouth, santa_hat, black_pantyhose, christmas, hooded_coat, open_coat, snowing |
| 8 | 12 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, solo, white_shirt, beret, red_vest, black_gloves, blush, collared_shirt, flower, ascot, hat_feather, pink_skirt, open_mouth, bow, frills, striped_clothes, striped_thighhighs, one_eye_closed, puffy_sleeves |
| 9 | 5 |  |  |  |  |  | 1girl, badge, bare_shoulders, looking_at_viewer, solo, sunglasses, visor_cap, blush, choker, shoulder_cutout, eyewear_on_headwear, hand_on_own_hip, open_mouth, short_shorts, :d, belt, bikini, blue_headwear, blue_sky, holding_megaphone, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | detached_collar | detached_sleeves | looking_at_viewer | smile | solo | upper_body | white_sleeves | blush | closed_mouth | strapless_dress | flat_chest | simple_background | white_background | armpits | ribbon | belt | black_thighhighs | holding_book | hair_ribbon | open_mouth | book | :d | black_dress | long_sleeves | short_dress | black_headwear | open_coat | beret | off_shoulder | black_bikini | navel | ponytail | black_gloves | hair_bow | throat_microphone | collarbone | garrison_cap | headphones | school_swimsuit | blue_one-piece_swimsuit | covered_navel | name_tag | beanie | blue_coat | blue_dress | blue_gloves | blue_headwear | fur-trimmed_coat | fur-trimmed_dress | large_bow | red_bow | ankh | blue_footwear | boots | sack | badge | brown_pantyhose | santa_hat | black_pantyhose | christmas | hooded_coat | snowing | white_shirt | red_vest | collared_shirt | flower | ascot | hat_feather | pink_skirt | bow | frills | striped_clothes | striped_thighhighs | one_eye_closed | puffy_sleeves | sunglasses | visor_cap | choker | shoulder_cutout | eyewear_on_headwear | hand_on_own_hip | short_shorts | bikini | blue_sky | holding_megaphone | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:------------------|:-------------------|:--------------------|:--------|:-------|:-------------|:----------------|:--------|:---------------|:------------------|:-------------|:--------------------|:-------------------|:----------|:---------|:-------|:-------------------|:---------------|:--------------|:-------------|:-------|:-----|:--------------|:---------------|:--------------|:-----------------|:------------|:--------|:---------------|:---------------|:--------|:-----------|:---------------|:-----------|:--------------------|:-------------|:---------------|:-------------|:------------------|:--------------------------|:----------------|:-----------|:---------|:------------|:-------------|:--------------|:----------------|:-------------------|:--------------------|:------------|:----------|:-------|:----------------|:--------|:-------|:--------|:------------------|:------------|:------------------|:------------|:--------------|:----------|:--------------|:-----------|:-----------------|:---------|:--------|:--------------|:-------------|:------|:---------|:------------------|:---------------------|:-----------------|:----------------|:-------------|:------------|:---------|:------------------|:----------------------|:------------------|:---------------|:---------|:-----------|:--------------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | | X | | | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | X | | | | | X | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | X | X | X | | | X | | X | | X | X | | | X | X | | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 28 |  |  |  |  |  | X | X | | | X | X | X | | | X | X | | X | X | X | | X | | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | X | | | X | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | X | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | X | X | | | X | | | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | | X | X | X | | | X | | | | | | | | | | | | X | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | | X | | X | | | X | | | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
SankarV/processed_marketing_email_dataset | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 30474
num_examples: 17
download_size: 31271
dataset_size: 30474
---
|
autoevaluate/autoeval-eval-scan-simple-0b9bd3-1528755178 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scan
eval_info:
task: summarization
model: ARTeLab/it5-summarization-fanpage
metrics: []
dataset_name: scan
dataset_config: simple
dataset_split: train
col_mapping:
text: commands
target: actions
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-fanpage
* Dataset: scan
* Config: simple
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@test_yoon_0921](https://huggingface.co/test_yoon_0921) for evaluating this model. |
chathuranga-jayanath/selfapr-manipulation-bug-error-context-all | ---
dataset_info:
features:
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 415138320
num_examples: 664192
- name: validation
num_bytes: 52113778
num_examples: 83023
- name: test
num_bytes: 51491901
num_examples: 83023
download_size: 236457285
dataset_size: 518743999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ellenhp/libpostal | ---
dataset_info:
- config_name: geoplanet
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 14351561106.506264
num_examples: 130909357
- name: test
num_bytes: 144965292.49373704
num_examples: 1322317
download_size: 4039968748
dataset_size: 14496526399.0
- config_name: openaddresses
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 61976401746.036736
num_examples: 446673083
- name: test
num_bytes: 626024353.9632672
num_examples: 4511850
download_size: 21993145847
dataset_size: 62602426100.0
- config_name: openstreetmap_addresses
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 35406459284.34487
num_examples: 316914300
- name: test
num_bytes: 357641053.655127
num_examples: 3201155
download_size: 13366122418
dataset_size: 35764100338.0
- config_name: openstreetmap_places
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 4341603986.997948
num_examples: 48989431
- name: test
num_bytes: 43854609.00205241
num_examples: 494843
download_size: 1597409288
dataset_size: 4385458596.0
- config_name: openstreetmap_ways
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 9703643777.614073
num_examples: 72476682
- name: test
num_bytes: 98016644.3859272
num_examples: 732088
download_size: 3262932325
dataset_size: 9801660422.0
- config_name: uk_openaddresses
features:
- name: lang
dtype: string
- name: country
dtype: string
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 212476615.73347768
num_examples: 1724602
- name: test
num_bytes: 2146324.2665223135
num_examples: 17421
download_size: 50229957
dataset_size: 214622940.0
configs:
- config_name: geoplanet
data_files:
- split: train
path: geoplanet/train-*
- split: test
path: geoplanet/test-*
- config_name: openaddresses
data_files:
- split: train
path: openaddresses/train-*
- split: test
path: openaddresses/test-*
- config_name: openstreetmap_addresses
data_files:
- split: train
path: openstreetmap_addresses/train-*
- split: test
path: openstreetmap_addresses/test-*
- config_name: openstreetmap_places
data_files:
- split: train
path: openstreetmap_places/train-*
- split: test
path: openstreetmap_places/test-*
- config_name: openstreetmap_ways
data_files:
- split: train
path: openstreetmap_ways/train-*
- split: test
path: openstreetmap_ways/test-*
- config_name: uk_openaddresses
data_files:
- split: train
path: uk_openaddresses/train-*
- split: test
path: uk_openaddresses/test-*
---
# Under Construction: Libpostal training dataset
For licensing information, refer to the [libpostal readme](https://github.com/openvenues/libpostal/blob/57eaa414ceadb48d5922099eeaa446b02894a2e4/README.md#parser-training-sets).
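
The features declared above pair a `tokens` sequence with a parallel integer `tags` sequence, one label id per token. A minimal sketch of zipping a row back into (token, tag) pairs — the row values here are invented, and the integer tag ids are purely illustrative, since the actual label vocabulary is defined by libpostal's training pipeline rather than by this card:

```python
# Hypothetical row matching the declared features; the tag ids are
# illustrative only -- the real label set comes from libpostal itself.
row = {
    "lang": "en",
    "country": "us",
    "tokens": ["123", "main", "st", "springfield"],
    "tags": [0, 1, 1, 2],
}

# Sanity check: the two sequences must stay aligned.
assert len(row["tokens"]) == len(row["tags"])

token_tags = list(zip(row["tokens"], row["tags"]))
```

A real row can be fetched with `load_dataset("ellenhp/libpostal", "geoplanet", split="train")`, substituting any of the config names listed in the metadata above.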
|
jlbaker361/small_addition_decimal | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1827.5555555555557
num_examples: 40
- name: test
num_bytes: 228.44444444444446
num_examples: 5
download_size: 4479
dataset_size: 2056.0
---
# Dataset Card for "small_addition_decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mvasiliniuc/iva-kotlin-codeint-clean-valid-tokenized | ---
dataset_info:
features:
- name: ratio
dtype: float64
- name: config_or_test
dtype: bool
- name: has_no_keywords
dtype: bool
- name: has_few_assignments
dtype: bool
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
splits:
- name: train
num_bytes: 188689431
num_examples: 41843
download_size: 74311910
dataset_size: 188689431
---
# Dataset Card for "iva-kotlin-codeint-clean-valid-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CJWeiss/lcr_final_id_rename | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 82132163
num_examples: 2918
- name: test
num_bytes: 18921115
num_examples: 584
- name: valid
num_bytes: 12959086
num_examples: 389
download_size: 56066938
dataset_size: 114012364
---
# Dataset Card for "lcr_final_id_rename"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonavila/DRAL | ---
license: cc0-1.0
task_categories:
- translation
language:
- en
- es
---
# Dialogs Re-enacted Across Languages (DRAL) corpus
DRAL is a bilingual speech corpus of parallel utterances, built from recorded conversations, short fragments of which are re-enacted in a different language. It is intended as a resource for research, especially for training and evaluating speech-to-speech translation models and systems. We dedicate this corpus to the public domain; there is no copyright (CC0).
DRAL is described in a new technical report: [Dialogs Re-enacted Across Languages, Version 2](https://arxiv.org/abs/2211.11584), Nigel G. Ward, Jonathan E. Avila, Emilia Rivas, Divette Marco.
Some initial analyses of this data are described in our [Interspeech 2023 paper](https://arxiv.org/abs/2307.04123).
The releases include 2,893 short matched Spanish-English pairs (over 2 hours of speech) taken from 104 conversations with 70 unique participants. There are also some illustrative, lower-quality pairs in Bengali-English, Japanese-English, and French-English. All are packaged together with the full original conversations and the full re-enactment recording sessions.
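
As a sketch of how matched fragments could be paired up programmatically — the identifiers and texts below are invented for illustration; the corpus's actual naming scheme is documented in the technical report linked below:

```python
# Hypothetical fragment metadata: matched EN/ES fragments are assumed to
# share an identifier and differ only in their language prefix.
fragments = [
    {"id": "EN_007_12", "lang": "EN", "text": "so what did you do this weekend?"},
    {"id": "ES_007_12", "lang": "ES", "text": "¿y qué hiciste este fin de semana?"},
    {"id": "EN_007_13", "lang": "EN", "text": "that sounds fun"},
    {"id": "ES_007_13", "lang": "ES", "text": "eso suena divertido"},
]

def match_pairs(fragments):
    """Group fragments by the language-independent part of their id."""
    by_key = {}
    for frag in fragments:
        key = frag["id"].split("_", 1)[1]  # drop the language prefix
        by_key.setdefault(key, {})[frag["lang"]] = frag["text"]
    return by_key

pairs = match_pairs(fragments)
# pairs["007_12"] holds the EN and ES versions of the same fragment
```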
## Links
- [DRAL home page](https://www.cs.utep.edu/nigel/dral/)
- [DRAL GitHub repo](https://github.com/joneavila/DRAL)
- [DRAL technical report](https://arxiv.org/abs/2211.11584)
- [Interspeech 2023 paper](https://arxiv.org/abs/2307.04123) |
AlekseyKorshuk/midjourney-prompts-text-dedup | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: __cluster__
dtype: int64
splits:
- name: train
num_bytes: 461516230.6301592
num_examples: 2802392
download_size: 212898248
dataset_size: 461516230.6301592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Rahulrayudu/Crop_QA_Dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 196085
num_examples: 500
download_size: 92189
dataset_size: 196085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16 | ---
pretty_name: Evaluation run of fblgit/una-cybertron-7b-v2-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fblgit/una-cybertron-7b-v2-bf16](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:28:35.097444](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16/blob/main/results_2023-12-04T16-28-35.097444.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6349296405755961,\n\
\ \"acc_stderr\": 0.03261211472247009,\n \"acc_norm\": 0.6370258261406261,\n\
\ \"acc_norm_stderr\": 0.03327308531523366,\n \"mc1\": 0.48714810281517745,\n\
\ \"mc1_stderr\": 0.017497717944299825,\n \"mc2\": 0.646322826116642,\n\
\ \"mc2_stderr\": 0.015041829082644448\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6717785301732723,\n\
\ \"acc_stderr\": 0.004686062421158145,\n \"acc_norm\": 0.8584943238398726,\n\
\ \"acc_norm_stderr\": 0.0034783009945146925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.024137632429337714,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.024137632429337714\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922534,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922534\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.01622353351036511,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.01622353351036511\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48714810281517745,\n\
\ \"mc1_stderr\": 0.017497717944299825,\n \"mc2\": 0.646322826116642,\n\
\ \"mc2_stderr\": 0.015041829082644448\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5504169825625473,\n \
\ \"acc_stderr\": 0.013702290047884747\n }\n}\n```"
repo_url: https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-28-35.097444.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- '**/details_harness|winogrande|5_2023-12-04T16-28-35.097444.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-28-35.097444.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_28_35.097444
path:
- results_2023-12-04T16-28-35.097444.parquet
- split: latest
path:
- results_2023-12-04T16-28-35.097444.parquet
---
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v2-bf16](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16",
"harness_winogrande_5",
split="train")
```
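The config names listed in the metadata appear to be a mechanical rewrite of the harness task identifiers that show up in the parquet file names (pipes, hyphens, and colons all become underscores). A small helper sketching that inferred convention, useful for programmatically picking the right config:

```python
def harness_task_to_config(task: str) -> str:
    """Map a harness task id (as it appears in the parquet file names)
    to the corresponding config_name, e.g.
    "harness|hendrycksTest-formal_logic|5" -> "harness_hendrycksTest_formal_logic_5".
    The rule is inferred from the config listings above, not an official API.
    """
    for sep in "|-:":
        task = task.replace(sep, "_")
    return task

print(harness_task_to_config("harness|truthfulqa:mc|0"))  # -> harness_truthfulqa_mc_0
```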
## Latest results
These are the [latest results from run 2023-12-04T16:28:35.097444](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16/blob/main/results_2023-12-04T16-28-35.097444.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6349296405755961,
"acc_stderr": 0.03261211472247009,
"acc_norm": 0.6370258261406261,
"acc_norm_stderr": 0.03327308531523366,
"mc1": 0.48714810281517745,
"mc1_stderr": 0.017497717944299825,
"mc2": 0.646322826116642,
"mc2_stderr": 0.015041829082644448
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6717785301732723,
"acc_stderr": 0.004686062421158145,
"acc_norm": 0.8584943238398726,
"acc_norm_stderr": 0.0034783009945146925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337714,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922534,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922534
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.01622353351036511,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.01622353351036511
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48714810281517745,
"mc1_stderr": 0.017497717944299825,
"mc2": 0.646322826116642,
"mc2_stderr": 0.015041829082644448
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.01103033579861744
},
"harness|gsm8k|5": {
"acc": 0.5504169825625473,
"acc_stderr": 0.013702290047884747
}
}
```
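The JSON above reports one accuracy per task; an aggregate like the MMLU score is just the mean over the `hendrycksTest` entries. A minimal sketch using three values copied from the results (the full average would use all the `hendrycksTest` tasks, not this subset):

```python
# Illustrative subset of the per-task accuracies reported above.
mmlu_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.24,
    "harness|hendrycksTest-anatomy|5": 0.6222222222222222,
    "harness|hendrycksTest-astronomy|5": 0.7039473684210527,
}

# Unweighted mean over the selected tasks.
subset_average = sum(mmlu_acc.values()) / len(mmlu_acc)
print(round(subset_average, 4))  # -> 0.5221
```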
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
alzoubi36/privaseer_demo | ---
language: en
license: gpl-3.0
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: hash
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 38674924
num_examples: 4000
download_size: 18262815
dataset_size: 38674924
---
## Privaseer Dataset Demo
Hugging Face version of the demo [Privaseer](https://privaseer.ist.psu.edu/) dataset.
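Each record carries a content `hash` field (see the features in the metadata above), which lends itself to the kind of exact-duplicate filtering described in the paper's pipeline. An illustrative sketch with made-up records (the field values here are hypothetical, only the schema comes from the card):

```python
def dedup_by_hash(records):
    """Keep the first record seen for each content hash."""
    seen, unique = set(), []
    for rec in records:
        if rec["hash"] not in seen:
            seen.add(rec["hash"])
            unique.append(rec)
    return unique

sample = [
    {"id": 0, "title": "Policy A", "text": "We collect...", "hash": "abc"},
    {"id": 1, "title": "Policy A (mirror)", "text": "We collect...", "hash": "abc"},
    {"id": 2, "title": "Policy B", "text": "Your data...", "hash": "def"},
]
print([r["id"] for r in dedup_by_hash(sample)])  # -> [0, 2]
```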
<pre>
@inproceedings{srinath-etal-2021-privacy,
title = "Privacy at Scale: Introducing the {P}riva{S}eer Corpus of Web Privacy Policies",
author = "Srinath, Mukund and
Wilson, Shomir and
Giles, C Lee",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.532",
doi = "10.18653/v1/2021.acl-long.532",
pages = "6829--6839",
abstract = "Organisations disclose their privacy practices by posting privacy policies on their websites. Even though internet users often care about their digital privacy, they usually do not read privacy policies, since understanding them requires a significant investment of time and effort. Natural language processing has been used to create experimental tools to interpret privacy policies, but there has been a lack of large privacy policy corpora to facilitate the creation of large-scale semi-supervised and unsupervised models to interpret and simplify privacy policies. Thus, we present the PrivaSeer Corpus of 1,005,380 English language website privacy policies collected from the web. The number of unique websites represented in PrivaSeer is about ten times larger than the next largest public collection of web privacy policies, and it surpasses the aggregate of unique websites represented in all other publicly available privacy policy corpora combined. We describe a corpus creation pipeline with stages that include a web crawler, language detection, document classification, duplicate and near-duplicate removal, and content extraction. We employ an unsupervised topic modelling approach to investigate the contents of policy documents in the corpus and discuss the distribution of topics in privacy policies at web scale. We further investigate the relationship between privacy policy domain PageRanks and text features of the privacy policies. Finally, we use the corpus to pretrain PrivBERT, a transformer-based privacy policy language model, and obtain state of the art results on the data practice classification and question answering tasks.",}
</pre> |
open-llm-leaderboard/details_saberai__Zro1.5_3B | ---
pretty_name: Evaluation run of saberai/Zro1.5_3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saberai/Zro1.5_3B](https://huggingface.co/saberai/Zro1.5_3B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saberai__Zro1.5_3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T19:33:43.363454](https://huggingface.co/datasets/open-llm-leaderboard/details_saberai__Zro1.5_3B/blob/main/results_2023-12-27T19-33-43.363454.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2628913556214231,\n\
\ \"acc_stderr\": 0.031108716303916813,\n \"acc_norm\": 0.2632892835008201,\n\
\ \"acc_norm_stderr\": 0.03179345445075825,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.36891896664444634,\n\
\ \"mc2_stderr\": 0.01421300651619945\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3216723549488055,\n \"acc_stderr\": 0.013650488084494166,\n\
\ \"acc_norm\": 0.35921501706484643,\n \"acc_norm_stderr\": 0.014020224155839152\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4644493128858793,\n\
\ \"acc_stderr\": 0.0049771527464785885,\n \"acc_norm\": 0.6111332403903604,\n\
\ \"acc_norm_stderr\": 0.004864966792310698\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095462,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095462\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491842,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462843,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462843\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.26262626262626265,\n \"acc_stderr\": 0.03135305009533085,\n \"\
acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.03135305009533085\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752947,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336127,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336127\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24954128440366974,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.24954128440366974,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046955,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046955\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692339,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692339\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417593,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417593\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071138,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071138\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410626,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410626\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953777,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.01094657096634877,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.01094657096634877\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487414,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487414\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.017440820367402503,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.017440820367402503\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209196,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.36891896664444634,\n\
\ \"mc2_stderr\": 0.01421300651619945\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5872138910812944,\n \"acc_stderr\": 0.013837060648682105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683984\n }\n}\n```"
repo_url: https://huggingface.co/saberai/Zro1.5_3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|arc:challenge|25_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|gsm8k|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hellaswag|10_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T19-33-43.363454.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T19-33-43.363454.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- '**/details_harness|winogrande|5_2023-12-27T19-33-43.363454.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T19-33-43.363454.parquet'
- config_name: results
data_files:
- split: 2023_12_27T19_33_43.363454
path:
- results_2023-12-27T19-33-43.363454.parquet
- split: latest
path:
- results_2023-12-27T19-33-43.363454.parquet
---
# Dataset Card for Evaluation run of saberai/Zro1.5_3B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saberai/Zro1.5_3B](https://huggingface.co/saberai/Zro1.5_3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saberai__Zro1.5_3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-27T19:33:43.363454](https://huggingface.co/datasets/open-llm-leaderboard/details_saberai__Zro1.5_3B/blob/main/results_2023-12-27T19-33-43.363454.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2628913556214231,
"acc_stderr": 0.031108716303916813,
"acc_norm": 0.2632892835008201,
"acc_norm_stderr": 0.03179345445075825,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.36891896664444634,
"mc2_stderr": 0.01421300651619945
},
"harness|arc:challenge|25": {
"acc": 0.3216723549488055,
"acc_stderr": 0.013650488084494166,
"acc_norm": 0.35921501706484643,
"acc_norm_stderr": 0.014020224155839152
},
"harness|hellaswag|10": {
"acc": 0.4644493128858793,
"acc_stderr": 0.0049771527464785885,
"acc_norm": 0.6111332403903604,
"acc_norm_stderr": 0.004864966792310698
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095462,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095462
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491842,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462843,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462843
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.03135305009533085,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.03135305009533085
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.02772206549336127,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.02772206549336127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24954128440366974,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.24954128440366974,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046955,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046955
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692339,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692339
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417593,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417593
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071138,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071138
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410626,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410626
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953777,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.01094657096634877,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.01094657096634877
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487414,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487414
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.017440820367402503,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.017440820367402503
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288086,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288086
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209196,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.36891896664444634,
"mc2_stderr": 0.01421300651619945
},
"harness|winogrande|5": {
"acc": 0.5872138910812944,
"acc_stderr": 0.013837060648682105
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683984
}
}
```
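Once the results JSON above is parsed (e.g. with `json.load`), the per-task metrics can be aggregated programmatically. A minimal sketch, using an illustrative excerpt of the dict rather than the full file:

```python
# Minimal sketch: aggregating per-task accuracies from a results dict
# shaped like the JSON above. The values here are an illustrative excerpt,
# not the full results file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3333333333333333},
    "harness|winogrande|5": {"acc": 0.5872138910812944},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_mmlu_acc, 4))  # → 0.3017
```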
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
cjvt/sloleks | ---
license: cc-by-sa-4.0
---
# Dataset Card for Sloleks 3
**Important**: this is a mostly complete script for processing Sloleks 3. Certain data properties may not be exposed through the script.
Please see the [CLARIN repository](https://www.clarin.si/repository/xmlui/handle/11356/1745) for full details on what the dataset contains, and open an issue or a pull request if you require some other information from the raw data.
### Dataset Summary
Sloleks is a reference morphological lexicon of Slovene that was developed to be used in various NLP applications and language manuals.
It contains Slovene lemmas, their inflected or derivative word forms and the corresponding grammatical description.
In addition to the approx. 100,000 entries already available in [Sloleks 2.0](http://hdl.handle.net/11356/1230), Sloleks 3.0 contains an additional
cca. 265,000 newly generated entries from the most frequent lemmas in [Gigafida 2.0](http://hdl.handle.net/11356/1320) not yet included in previous versions of Sloleks.
For verbs, adjectives, adverbs, and common nouns, the lemmas were checked manually by three annotators and included in Sloleks only if confirmed as legitimate by at
least one annotator. No manual checking was performed on proper nouns. Lemmatization rules, part-of-speech categorization and the set of feature-value pairs follow the
[MULTEXT-East morphosyntactic specifications for Slovenian](https://nl.ijs.si/ME/V6/msd/html/msd-sl.html).
### Supported Tasks and Leaderboards
Other (the data is a knowledge base - lexicon).
### Languages
Slovenian.
## Dataset Structure
### Data Instances
Entry for the verb `absorbirati` (English: *to absorb*):
```
{
'headword_lemma': 'absorbirati',
'pos': 'verb',
'lex_unit': {'id': 'LE_a293f9ab871299f116dff2cc1421367a', 'form': 'absorbirati', 'key': 'G_absorbirati', 'type': 'single'},
'word_forms':
[
{'forms': ['absorbirati'], 'msd': 'Ggvn'},
{'forms': ['absorbirat'], 'msd': 'Ggvm'},
{'forms': ['absorbiral'], 'msd': 'Ggvd-em'},
{'forms': ['absorbirala'], 'msd': 'Ggvd-dm'},
{'forms': ['absorbirali'], 'msd': 'Ggvd-mm'},
{'forms': ['absorbirala'], 'msd': 'Ggvd-ez'},
{'forms': ['absorbirali'], 'msd': 'Ggvd-dz'},
{'forms': ['absorbirale'], 'msd': 'Ggvd-mz'},
{'forms': ['absorbiralo'], 'msd': 'Ggvd-es'},
{'forms': ['absorbirali'], 'msd': 'Ggvd-ds'},
{'forms': ['absorbirala'], 'msd': 'Ggvd-ms'},
{'forms': ['absorbiram'], 'msd': 'Ggvspe'},
{'forms': ['absorbiraš'], 'msd': 'Ggvsde'},
{'forms': ['absorbira'], 'msd': 'Ggvste'},
{'forms': ['absorbirava'], 'msd': 'Ggvspd'},
{'forms': ['absorbirata'], 'msd': 'Ggvsdd'},
{'forms': ['absorbirata'], 'msd': 'Ggvstd'},
{'forms': ['absorbiramo'], 'msd': 'Ggvspm'},
{'forms': ['absorbirate'], 'msd': 'Ggvsdm'},
{'forms': ['absorbirajo'], 'msd': 'Ggvstm'},
{'forms': ['absorbirajva'], 'msd': 'Ggvvpd'},
{'forms': ['absorbirajmo'], 'msd': 'Ggvvpm'},
{'forms': ['absorbiraj'], 'msd': 'Ggvvde'},
{'forms': ['absorbirajta'], 'msd': 'Ggvvdd'},
{'forms': ['absorbirajte'], 'msd': 'Ggvvdm'}
],
'is_manually_checked': True
}
```
### Data Fields
- `headword_lemma`: lemma of the headword;
- `pos`: coarse-grained part-of-speech tag (one of `{"noun", "verb", "adjective", "adverb", "pronoun", "numeral", "preposition", "conjunction", "particle", "interjection", "abbreviation", "residual"}`);
- `lex_unit`: properties of the lexical unit corresponding to the headword (`id`, `form`, `key` and `type`);
- `word_forms`: forms of the headword, each with its own list of possible forms and the morphosyntactic description of the form;
- `is_manually_checked`: whether the headword was manually validated or not.
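The `msd` strings follow the MULTEXT-East specifications linked above, so word forms can be selected by MSD prefix. A minimal sketch over a trimmed version of the `absorbirati` instance shown earlier:

```python
# Minimal sketch: selecting word forms by morphosyntactic description (MSD).
# `entry` is a trimmed version of the `absorbirati` instance shown above;
# MSD codes follow the MULTEXT-East specifications for Slovenian.
entry = {
    "headword_lemma": "absorbirati",
    "word_forms": [
        {"forms": ["absorbirati"], "msd": "Ggvn"},
        {"forms": ["absorbiraj"], "msd": "Ggvvde"},
        {"forms": ["absorbirajta"], "msd": "Ggvvdd"},
        {"forms": ["absorbirajte"], "msd": "Ggvvdm"},
    ],
}

def forms_with_msd_prefix(entry, prefix):
    """Return all surface forms whose MSD starts with `prefix`."""
    return [
        form
        for wf in entry["word_forms"]
        if wf["msd"].startswith(prefix)
        for form in wf["forms"]
    ]

# The MSD prefix "Ggvv" selects the imperative forms in this entry.
print(forms_with_msd_prefix(entry, "Ggvv"))
# → ['absorbiraj', 'absorbirajta', 'absorbirajte']
```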
## Additional Information
### Dataset Curators
Jaka Čibej et al. (please see http://hdl.handle.net/11356/1745 for the full list).
### Licensing Information
CC BY-SA 4.0.
### Citation Information
```
@misc{sloleks3,
title = {Morphological lexicon Sloleks 3.0},
author = {{\v C}ibej, Jaka and Gantar, Kaja and Dobrovoljc, Kaja and Krek, Simon and Holozan, Peter and Erjavec, Toma{\v z} and Romih, Miro and Arhar Holdt, {\v S}pela and Krsnik, Luka and Robnik-{\v S}ikonja, Marko},
url = {http://hdl.handle.net/11356/1745},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {Creative Commons - Attribution-{ShareAlike} 4.0 International ({CC} {BY}-{SA} 4.0)},
year = {2022}
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset.
kenhktsui/TM-DATA_quality_score_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: quality_score_v1
dtype: float64
splits:
- name: train
num_bytes: 28772807486
num_examples: 8140641
download_size: 16400919120
dataset_size: 28772807486
---
# Dataset Card for "TM-DATA_quality_score_v1"
Adding quality score v1 to [Locutusque/TM-DATA](https://huggingface.co/datasets/Locutusque/TM-DATA)
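A minimal sketch of how the `quality_score_v1` column could be used to filter rows; the sample rows and the 0.5 threshold are illustrative assumptions, and on the real dataset the same predicate could be passed to `datasets.Dataset.filter`:

```python
# Minimal sketch: keeping only rows above a quality-score threshold.
# The sample rows are illustrative stand-ins for real dataset rows.
rows = [
    {"text": "well-formed paragraph", "quality_score_v1": 0.91},
    {"text": "boilerplate fragment", "quality_score_v1": 0.12},
]

THRESHOLD = 0.5  # assumption: pick a cutoff appropriate for your use case

high_quality = [r for r in rows if r["quality_score_v1"] >= THRESHOLD]
print(len(high_quality))  # → 1
```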
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ssssasdasdasdasdqwd/v3_gameasset_lora | ---
license: unknown
---
nluai/ZaloAI_Format_PhoGPT | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 262654
num_examples: 687
download_size: 132502
dataset_size: 262654
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
atmallen/mmlu_aux_chat_binary | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int32
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: validation
num_bytes: 10714952
num_examples: 4036
- name: test
num_bytes: 101960767
num_examples: 37506
download_size: 50210816
dataset_size: 112675719
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu_aux_chat_binary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deetsadi/processed_dwi | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 15336901.0
num_examples: 200
download_size: 0
dataset_size: 15336901.0
---
# Dataset Card for "processed_dwi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_80 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1292124696
num_examples: 251778
download_size: 1318926483
dataset_size: 1292124696
---
# Dataset Card for "chunk_80"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pkboom/codegen-v2 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1077609
num_examples: 272
download_size: 370057
dataset_size: 1077609
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "codegen-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CortexLM/dalle-3-dataset | ---
license: unknown
task_categories:
- text-to-image
language:
- en
---
# **DALL·E 3 Dataset by Bittensor Network (NetUID 18)**
**Description:** This dataset was generated by Subnetwork 18 (Bittensor), utilizing the capabilities of DALL·E 3.
**WandB:** [Cortex-T WandB](https://wandb.ai/cortex-t/synthetic-QA/)
**Disclaimer: Image Attribution and Copyright Notice**
The images included in this dataset have been sourced from WandB (Weights and Biases). While every effort has been made to ensure compliance with copyright and intellectual property rights, Cortex Foundation cannot guarantee the absence of any copyright or intellectual property infringements.
Cortex Foundation assumes no responsibility or liability for any potential copyright issues associated with the images in this dataset. Users of this dataset are strongly encouraged to verify the copyright status of individual images and ensure compliance with applicable laws and regulations before using or redistributing the dataset.
By accessing and using this dataset, you acknowledge and agree that Cortex Foundation is not responsible for any copyright violations or legal consequences that may arise from the use of these images.
If you have any concerns or questions regarding the copyright status of specific images, please contact the original source or copyright holder directly.
Cortex Foundation reserves the right to update or modify this disclaimer as needed to reflect any changes in the dataset's composition or to address emerging legal or ethical considerations. |
result-kand2-sdxl-wuerst-karlo/38127251 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 193
num_examples: 10
download_size: 1396
dataset_size: 193
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "38127251"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Giacinta/weibo_ai | ---
license: apache-2.0
---
|
CyberHarem/new_orleans_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of new_orleans/ニューオリンズ/新奥尔良 (Azur Lane)
This is the dataset of new_orleans/ニューオリンズ/新奥尔良 (Azur Lane), containing 39 images and their tags.
The core tags of this character are `short_hair, breasts, large_breasts, blue_eyes, bangs, grey_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 45.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/new_orleans_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 28.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/new_orleans_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 90 | 55.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/new_orleans_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 41.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/new_orleans_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 90 | 74.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/new_orleans_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/new_orleans_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
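For the IMG+TXT packages listed above (e.g. `dataset-800.zip`), a minimal pairing sketch; it assumes a flat layout where each image ships with a same-named `.txt` tag file (an assumption, since the card does not spell out the layout):

```python
import os
import tempfile

def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-named .txt tag file (assumed layout).

    Returns a list of (image_path, tags) tuples; images without a
    matching tag file are skipped.
    """
    image_exts = {".png", ".jpg", ".jpeg", ".webp"}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs

# Throwaway demo directory standing in for an extracted dataset-800.zip:
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "0001.png"), "wb").close()
with open(os.path.join(demo_dir, "0001.txt"), "w", encoding="utf-8") as f:
    f.write("1girl, solo")
demo_pairs = load_img_txt_pairs(demo_dir)
```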
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, black_gloves, cleavage, blush, closed_mouth, forehead, long_sleeves, black_thighhighs, red_necktie, simple_background, white_background, parted_bangs, black_dress |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, earrings, looking_at_viewer, smile, solo, bracelet, fake_animal_ears, rabbit_ears, sitting, wine_glass, cleavage_cutout, covered_navel, crossed_legs, official_alternate_costume, stool, closed_mouth, full_body, high_heels, holding_cup, sleeveless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | black_gloves | cleavage | blush | closed_mouth | forehead | long_sleeves | black_thighhighs | red_necktie | simple_background | white_background | parted_bangs | black_dress | bare_shoulders | earrings | bracelet | fake_animal_ears | rabbit_ears | sitting | wine_glass | cleavage_cutout | covered_navel | crossed_legs | official_alternate_costume | stool | full_body | high_heels | holding_cup | sleeveless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------------|:-----------|:--------|:---------------|:-----------|:---------------|:-------------------|:--------------|:--------------------|:-------------------|:---------------|:--------------|:-----------------|:-----------|:-----------|:-------------------|:--------------|:----------|:-------------|:------------------|:----------------|:---------------|:-----------------------------|:--------|:------------|:-------------|:--------------|:-------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/lilina_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lilina (Fire Emblem)
This is the dataset of lilina (Fire Emblem), containing 392 images and their tags.
The core tags of this character are `blue_hair, long_hair, blue_eyes, hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 392 | 421.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 392 | 270.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 813 | 520.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 392 | 387.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 813 | 690.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lilina_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, dress, looking_at_viewer, simple_background, solo, smile, white_background, blush, open_mouth, book, gloves, jewelry |
| 1 | 5 |  |  |  |  |  | 1girl, red_capelet, red_headwear, shiny_hair, simple_background, solo, upper_body, bangs, dress, hair_between_eyes, looking_at_viewer, white_background, blush, jewelry, :d, open_mouth |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, smile, dress, open_mouth, simple_background, couple, gloves, armor, blush, cape, closed_eyes, hetero, red_hair, white_background, jewelry, short_hair |
| 3 | 10 |  |  |  |  |  | bridal_veil, smile, wedding_dress, 1girl, bride, official_alternate_costume, solo, white_dress, cleavage, looking_at_viewer, simple_background, white_background, elbow_gloves, hair_flower, open_mouth, white_gloves, upper_body, white_flower, blush, holding_bouquet |
| 4 | 5 |  |  |  |  |  | 1girl, bangs, bouquet, bridal_veil, elbow_gloves, feather_trim, flower, full_body, holding, medium_breasts, solo, thigh_boots, thighhighs, wedding_dress, white_dress, white_footwear, white_gloves, bride, cleavage, detached_collar, gold_trim, hair_ornament, shiny_hair, blush, looking_away, open_mouth, simple_background, smile, white_background, closed_mouth, feathers, high_heel_boots, jewelry, looking_at_viewer, petals, standing, transparent_background |
| 5 | 21 |  |  |  |  |  | 1girl, red_bikini, hair_flower, navel, head_wreath, smile, open_mouth, solo, bangs, looking_at_viewer, official_alternate_costume, holding, blush, hibiscus, water, cloud, day, jewelry, sky, ocean, outdoors |
| 6 | 7 |  |  |  |  |  | 1girl, hetero, mosaic_censoring, nipples, penis, sex, solo_focus, vaginal, 1boy, blush, pantyhose, torn_clothes, cum_in_pussy, large_breasts, open_mouth, medium_breasts, navel, straddling, topless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | looking_at_viewer | simple_background | solo | smile | white_background | blush | open_mouth | book | gloves | jewelry | red_capelet | red_headwear | shiny_hair | upper_body | bangs | hair_between_eyes | :d | 1boy | couple | armor | cape | closed_eyes | hetero | red_hair | short_hair | bridal_veil | wedding_dress | bride | official_alternate_costume | white_dress | cleavage | elbow_gloves | hair_flower | white_gloves | white_flower | holding_bouquet | bouquet | feather_trim | flower | full_body | holding | medium_breasts | thigh_boots | thighhighs | white_footwear | detached_collar | gold_trim | hair_ornament | looking_away | closed_mouth | feathers | high_heel_boots | petals | standing | transparent_background | red_bikini | navel | head_wreath | hibiscus | water | cloud | day | sky | ocean | outdoors | mosaic_censoring | nipples | penis | sex | solo_focus | vaginal | pantyhose | torn_clothes | cum_in_pussy | large_breasts | straddling | topless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------------------|:-------|:--------|:-------------------|:--------|:-------------|:-------|:---------|:----------|:--------------|:---------------|:-------------|:-------------|:--------|:--------------------|:-----|:-------|:---------|:--------|:-------|:--------------|:---------|:-----------|:-------------|:--------------|:----------------|:--------|:-----------------------------|:--------------|:-----------|:---------------|:--------------|:---------------|:---------------|:------------------|:----------|:---------------|:---------|:------------|:----------|:-----------------|:--------------|:-------------|:-----------------|:------------------|:------------|:----------------|:---------------|:---------------|:-----------|:------------------|:---------|:-----------|:-------------------------|:-------------|:--------|:--------------|:-----------|:--------|:--------|:------|:------|:--------|:-----------|:-------------------|:----------|:--------|:------|:-------------|:----------|:------------|:---------------|:---------------|:----------------|:-------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | | X | X | X | X | | X | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | X | X | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | X | X | X | X | | | X | | | X | | X | | | | | | | | | | | X | X | X | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 21 |  |  |  |  |  | X | | X | | X | X | | X | X | | | X | | | | | X | | | | | | | | | | | | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | | | | X | X | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
alirzb/DuringSeizurePlots | ---
dataset_info:
features:
- name: HI-normo-term
dtype: image
- name: HI-hypo-term
dtype: image
- name: HI-normo-preterm
dtype: image
splits:
- name: train
num_bytes: 26823.0
num_examples: 1
download_size: 28638
dataset_size: 26823.0
---
# Dataset Card for "DuringSeizurePlots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vtiyyal1/AskDocsEmpathy_gemma_it | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 16002022.0
num_examples: 6124
download_size: 6753119
dataset_size: 16002022.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PlayerJ/iOneBot_Custom_llama2_0318 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3453
num_examples: 9
download_size: 3435
dataset_size: 3453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_allknowingroger__RasGullaINEX12-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/RasGullaINEX12-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/RasGullaINEX12-7B-slerp](https://huggingface.co/allknowingroger/RasGullaINEX12-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__RasGullaINEX12-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T21:06:43.770096](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__RasGullaINEX12-7B-slerp/blob/main/results_2024-04-10T21-06-43.770096.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546041964673143,\n\
\ \"acc_stderr\": 0.03202128002848906,\n \"acc_norm\": 0.6538258900767938,\n\
\ \"acc_norm_stderr\": 0.0326910338531276,\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661203,\n \"mc2\": 0.7067770453389669,\n\
\ \"mc2_stderr\": 0.014555521195806487\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7059350726946824,\n\
\ \"acc_stderr\": 0.004546901132945117,\n \"acc_norm\": 0.8792073292172874,\n\
\ \"acc_norm_stderr\": 0.003252201593451836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661203,\n \"mc2\": 0.7067770453389669,\n\
\ \"mc2_stderr\": 0.014555521195806487\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330996\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047536\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/RasGullaINEX12-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-06-43.770096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-06-43.770096.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- '**/details_harness|winogrande|5_2024-04-10T21-06-43.770096.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T21-06-43.770096.parquet'
- config_name: results
data_files:
- split: 2024_04_10T21_06_43.770096
path:
- results_2024-04-10T21-06-43.770096.parquet
- split: latest
path:
- results_2024-04-10T21-06-43.770096.parquet
---
# Dataset Card for Evaluation run of allknowingroger/RasGullaINEX12-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/RasGullaINEX12-7B-slerp](https://huggingface.co/allknowingroger/RasGullaINEX12-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__RasGullaINEX12-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-10T21:06:43.770096](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__RasGullaINEX12-7B-slerp/blob/main/results_2024-04-10T21-06-43.770096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546041964673143,
"acc_stderr": 0.03202128002848906,
"acc_norm": 0.6538258900767938,
"acc_norm_stderr": 0.0326910338531276,
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661203,
"mc2": 0.7067770453389669,
"mc2_stderr": 0.014555521195806487
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382316
},
"harness|hellaswag|10": {
"acc": 0.7059350726946824,
"acc_stderr": 0.004546901132945117,
"acc_norm": 0.8792073292172874,
"acc_norm_stderr": 0.003252201593451836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661203,
"mc2": 0.7067770453389669,
"mc2_stderr": 0.014555521195806487
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480330996
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047536
}
}
```
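As a rough check, the per-task MMLU (`hendrycksTest`) scores above can be averaged with a short helper. This is a sketch only: it covers just the `hendrycksTest` entries, not the other benchmarks folded into the `"all"` block, and it assumes a results dict shaped like the JSON above.

```python
def mean_mmlu_acc(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest (MMLU) subtasks.

    `results` is assumed to map task names (e.g. "harness|hendrycksTest-anatomy|5")
    to metric dicts containing an 'acc' entry, as in the JSON above.
    """
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest")
    ]
    return sum(accs) / len(accs)
```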
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
quantity/mydataset1 | ---
license: apache-2.0
---
|
keonroohparvar/music_vid_256 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 93921778.139
num_examples: 9179
download_size: 80840102
dataset_size: 93921778.139
---
# Dataset Card for "music_vid_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zd21/SciInstruct | ---
license: apache-2.0
---
|
harshasurampudi/legal-reasoning-lfqa-synthetic | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: Context
dtype: string
- name: Question
dtype: string
- name: Legal Reasoning
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 31911499
num_examples: 14991
- name: test
num_bytes: 3176252
num_examples: 1497
- name: validation
num_bytes: 3186381
num_examples: 1496
download_size: 21924127
dataset_size: 38274132
size_categories:
- 10K<n<100K
---
# Dataset Card for "legal-reasoning-lfqa-synthetic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_fr_train | ---
pretty_name: '`mmarco/fr/train`'
viewer: false
source_datasets: ['irds/mmarco_fr']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/fr/train`
The `mmarco/fr/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/fr/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_fr`](https://huggingface.co/datasets/irds/mmarco_fr)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_fr_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_fr_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/mmarco_fr_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
AyoubChLin/CNN_News_Articles_2011-2022 | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: CNN News Articles from 2011 to 2022
size_categories:
- n<1K
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': business
'1': entertainment
'2': health
'3': news
'4': politics
'5': sport
splits:
- name: train
num_examples: 32218
- name: test
num_examples: 5686
train-eval-index:
- config: default
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
text: text
label: target
---
# CNN News Articles 2011-2022 Dataset
## Introduction
This dataset contains CNN News Articles from 2011 to 2022 after basic cleaning. The dataset includes the following information:
- Category
- Full text

The data was downloaded from Kaggle at this URL: https://www.kaggle.com/datasets/hadasu92/cnn-articles-after-basic-cleaning. The dataset was split into two sets:
- Train set with 32,218 examples
- Test set with 5,686 examples
## Usage
This dataset can be used for different natural language processing tasks such as text classification, text summarization, named entity recognition, and more. The dataset is available in Hugging Face Datasets with the ID AyoubChLin/CNN_News_Articles_2011-2022.
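For instance, the integer class labels can be decoded using the id-to-name mapping declared in the YAML metadata above (a minimal sketch; the mapping is copied from that metadata):

```python
# Class-label names taken from the dataset card's YAML metadata above.
LABEL_NAMES = ["business", "entertainment", "health", "news", "politics", "sport"]

def decode_label(label_id: int) -> str:
    """Map an integer class label to its human-readable category name."""
    return LABEL_NAMES[label_id]
```

Loading the dataset itself follows the usual pattern, e.g. `load_dataset("AyoubChLin/CNN_News_Articles_2011-2022")`.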
## Acknowledgements
The data was collected by the Kaggle user [hadasu92](https://github.com/hadasu). The splitting of the dataset into train and test sets was performed by [CHERGUELAINE Ayoub](https://www.linkedin.com/in/ayoub-cherguelaine/) & [BOUBEKRI Faycal](https://www.linkedin.com/in/faycal-boubekri-832848199/). |
arpachat/FashionTextImageSmall | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1176081.0
num_examples: 250
download_size: 750208
dataset_size: 1176081.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ASIDS/LewdRuStoryforTrain | ---
language:
- ru
tags:
- not-for-all-audiences
- instruction-finetuning
pretty_name: LewdRuStoryforTrain
size_categories:
- 1K<n<10K
license: mit
task_categories:
- text-generation
---
A collection of random erotic stories from the internet, covering various tags, but nothing extreme. |
zirui3/webMedQA-instructions | ---
license: cc-by-4.0
---
# summary
A Chinese medical question-answering instruction dataset based on `webMedQA`.
# Reference
[1]. Applying deep matching networks to Chinese medical question answering: A study and a dataset |
ydang/jsd_dataset | ---
license: cc
---
|
TingChen-ppmc/Shanghai_Dialect_Conversational_Speech_Corpus | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: gender
dtype: string
- name: speaker_id
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 422057259.808
num_examples: 3792
download_size: 436738370
dataset_size: 422057259.808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Corpus
This dataset is built from Magicdata [ASR-CZDIACSC: A CHINESE SHANGHAI DIALECT CONVERSATIONAL SPEECH CORPUS](https://magichub.com/datasets/shanghai-dialect-conversational-speech-corpus/)
This corpus is licensed under a [Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License](http://creativecommons.org/licenses/by-nc-nd/4.0/). Please refer to the license for further information.
Modifications: The audio is split into sentences based on the time spans in the transcription file. Sentences that span less than 1 second are discarded. Topics of conversation are removed.
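The filtering step described above can be sketched as follows. This is a sketch, not the exact preprocessing script; the segment layout `(start_seconds, end_seconds, text)` is an assumption for illustration.

```python
def filter_segments(segments, min_duration=1.0):
    """Keep only transcription segments lasting at least `min_duration` seconds,
    mirroring the preprocessing described above.

    `segments` is assumed to be a list of (start_seconds, end_seconds, text) tuples.
    """
    return [
        (start, end, text)
        for start, end, text in segments
        if end - start >= min_duration
    ]
```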
# Usage
To load this dataset, use
```python
from datasets import load_dataset
dialect_corpus = load_dataset("TingChen-ppmc/Shanghai_Dialect_Conversational_Speech_Corpus")
```
This dataset has only a train split. To split out a test split, use
```python
from datasets import load_dataset
train_split = load_dataset("TingChen-ppmc/Shanghai_Dialect_Conversational_Speech_Corpus", split="train")
# where test_size=0.5 means half of the dataset will be held out as the test split
corpus = train_split.train_test_split(test_size=0.5)
```
A sample data would be
```python
# note this data is from the Nanchang Dialect corpus, the data format is shared
{'audio':
{'path': 'A0001_S001_0_G0001_0.WAV',
'array': array([-0.00030518, -0.00039673,
-0.00036621, ..., -0.00064087,
-0.00015259, -0.00042725]),
'sampling_rate': 16000},
'gender': '女',
'speaker_id': 'G0001',
'transcription': '北京爱数智慧语音采集'
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arpanetus/hackernews_title_upvote_0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 13840984
num_examples: 15064
download_size: 8346861
dataset_size: 13840984
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hackernews_title_upvote_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
takeraparterer/Become-A-Living-God-Forum | ---
license: cc-by-nc-sa-3.0
---
# Become a living god forums dataset
This is a dataset of posts scraped from the top 100 new pages of the spirituality forum https://forum.becomealivinggod.com/
- **Developed by:** Takeraparterer
- **License:** Creative Commons Attribution-NonCommercial-ShareAlike 3.0
Please contact me via community tab if you wish for this to be taken down.
## Format:
```
{"Post_Id - Title": [["username", "message"], ...], ...}
``` |
linhqyy/data_aug_syllable | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 2380518
num_examples: 11340
- name: test
num_bytes: 125890
num_examples: 597
download_size: 579180
dataset_size: 2506408
---
# Dataset Card for "data_aug_syllable"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weijie210/UC_original_iter_0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train_sft
num_bytes: 223365978
num_examples: 100002
- name: test_sft
num_bytes: 45224701
num_examples: 20406
download_size: 137842275
dataset_size: 268590679
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
filipsch/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bvallegc/spoofing_detection_data_proccessed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: speaker_id
dtype: string
- name: system_id
dtype: string
- name: label
dtype:
class_label:
names:
'0': bonafide
'1': spoof
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 10001392270
num_examples: 22842
- name: test
num_bytes: 1128734898
num_examples: 2538
download_size: 4762954824
dataset_size: 11130127168
---
# Dataset Card for "spoofing_detection_data_proccessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
medal | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: medal
pretty_name: MeDAL
tags:
- disambiguation
dataset_info:
features:
- name: abstract_id
dtype: int32
- name: text
dtype: string
- name: location
sequence: int32
- name: label
sequence: string
splits:
- name: train
num_bytes: 3573399948
num_examples: 3000000
- name: test
num_bytes: 1190766821
num_examples: 1000000
- name: validation
num_bytes: 1191410723
num_examples: 1000000
- name: full
num_bytes: 15536883723
num_examples: 14393619
download_size: 21060929078
dataset_size: 21492461215
---
# Dataset Card for the MeDAL dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** https://github.com/BruceWen120/medal
- **Paper:** https://www.aclweb.org/anthology/2020.clinicalnlp-1.15/
- **Dataset (Kaggle):** https://www.kaggle.com/xhlulu/medal-emnlp
- **Dataset (Zenodo):** https://zenodo.org/record/4265632
- **Pretrained model:** https://huggingface.co/xhlu/electra-medal
- **Leaderboard:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
A large medical text dataset (14 GB), curated down to 4 GB, for abbreviation disambiguation, designed for natural language understanding pre-training in the medical domain. For example, DHF can be disambiguated to dihydrofolate, diastolic heart failure, dengue hemorrhagic fever, or dihydroxyfumarate.
### Supported Tasks and Leaderboards
Medical abbreviation disambiguation
### Languages
English (en)
## Dataset Structure
Each file is a table consisting of three columns:
* text: The normalized content of an abstract
* location: The location (index) of each abbreviation that was substituted
* label: The word that was substituted at the given location
### Data Instances
An example from the train split is:
```
{'abstract_id': 14145090,
'text': 'velvet antlers vas are commonly used in traditional chinese medicine and invigorant and contain many PET components for health promotion the velvet antler peptide svap is one of active components in vas based on structural study the svap interacts with tgfβ receptors and disrupts the tgfβ pathway we hypothesized that svap prevents cardiac fibrosis from pressure overload by blocking tgfβ signaling SDRs underwent TAC tac or a sham operation T3 one month rats received either svap mgkgday or vehicle for an additional one month tac surgery induced significant cardiac dysfunction FB activation and fibrosis these effects were improved by treatment with svap in the heart tissue tac remarkably increased the expression of tgfβ and connective tissue growth factor ctgf ROS species C2 and the phosphorylation C2 of smad and ERK kinases erk svap inhibited the increases in reactive oxygen species C2 ctgf expression and the phosphorylation of smad and erk but not tgfβ expression in cultured cardiac fibroblasts angiotensin ii ang ii had similar effects compared to tac surgery such as increases in αsmapositive CFs and collagen synthesis svap eliminated these effects by disrupting tgfβ IB to its receptors and blocking ang iitgfβ downstream signaling these results demonstrated that svap has antifibrotic effects by blocking the tgfβ pathway in CFs',
'location': [63],
'label': ['transverse aortic constriction']}
```
### Data Fields
The column types are:
* text: content of the abstract as a string
* location: indices of the substituted words, as a list of integers
* label: substituted words, as a list of strings
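To make the field semantics concrete, here is a minimal sketch of how `location` and `label` line up with the whitespace-tokenized `text`. The record below is synthetic, for illustration only; treating `location` as a word index follows the example shown in Data Instances.

```python
# Synthetic record, for illustration only: `location` holds word indices into
# the whitespace-tokenized text, and `label` the substituted expansions.
record = {
    "text": "the rats underwent TAC one month ago",
    "location": [3],
    "label": ["transverse aortic constriction"],
}

words = record["text"].split()
pairs = [(words[i], expansion)
         for i, expansion in zip(record["location"], record["label"])]
print(pairs)  # [('TAC', 'transverse aortic constriction')]
```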
### Data Splits
The following files are present:
* `full_data.csv`: The full dataset with all 14M abstracts.
* `train.csv`: The subset used to train the baseline and proposed models.
* `valid.csv`: The subset used to validate the model during training for hyperparameter selection.
* `test.csv`: The subset used to evaluate the model and report the results in the tables.
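Since `full_data.csv` holds all 14M abstracts, it is usually more practical to stream rows than to load the file whole. A minimal sketch (column names follow the schema documented above; the in-memory buffer stands in for a real file):

```python
import csv
import io

# Stream records one at a time instead of loading the whole CSV into memory.
def iter_rows(handle):
    yield from csv.DictReader(handle)

# Tiny in-memory stand-in for one of the real CSV files:
demo = io.StringIO("text,location,label\nsome abstract,[63],['expansion']\n")
first = next(iter_rows(demo))
print(first["text"])  # some abstract
```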
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
The original dataset was retrieved and modified from the [NLM website](https://www.nlm.nih.gov/databases/download/pubmed_medline.html).
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
Details on how the abbreviations were created can be found in section 2.2 (Dataset Creation) of the [ACL ClinicalNLP paper](https://aclanthology.org/2020.clinicalnlp-1.15.pdf).
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
Since the abstracts are written in English, the data is biased towards anglo-centric medical research. If you plan to use a model pre-trained on this dataset for a predominantly non-English community, it is important to verify whether there are negative biases present in your model, and ensure that they are correctly mitigated. For instance, you could fine-tune your dataset on a multilingual medical disambiguation dataset, or collect a dataset specific to your use case.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The ELECTRA model is licensed under [Apache 2.0](https://github.com/google-research/electra/blob/master/LICENSE). The license for the libraries used in this project (`transformers`, `pytorch`, etc.) can be found in their respective GitHub repository. Our model is released under a MIT license.
The original dataset was retrieved and modified from the [NLM website](https://www.nlm.nih.gov/databases/download/pubmed_medline.html). By using this dataset, you are bound by the [terms and conditions](https://www.nlm.nih.gov/databases/download/terms_and_conditions_pubmed.html) specified by NLM:
> INTRODUCTION
>
> Downloading data from the National Library of Medicine FTP servers indicates your acceptance of the following Terms and Conditions: No charges, usage fees or royalties are paid to NLM for this data.
>
> MEDLINE/PUBMED SPECIFIC TERMS
>
> NLM freely provides PubMed/MEDLINE data. Please note some PubMed/MEDLINE abstracts may be protected by copyright.
>
> GENERAL TERMS AND CONDITIONS
>
> * Users of the data agree to:
> * acknowledge NLM as the source of the data by including the phrase "Courtesy of the U.S. National Library of Medicine" in a clear and conspicuous manner,
> * properly use registration and/or trademark symbols when referring to NLM products, and
> * not indicate or imply that NLM has endorsed its products/services/applications.
>
> * Users who republish or redistribute the data (services, products or raw data) agree to:
> * maintain the most current version of all distributed data, or
> * make known in a clear and conspicuous manner that the products/services/applications do not reflect the most current/accurate data available from NLM.
>
> * These data are produced with a reasonable standard of care, but NLM makes no warranties express or implied, including no warranty of merchantability or fitness for particular purpose, regarding the accuracy or completeness of the data. Users agree to hold NLM and the U.S. Government harmless from any liability resulting from errors in the data. NLM disclaims any liability for any consequences due to use, misuse, or interpretation of information contained or not contained in the data.
>
> * NLM does not provide legal advice regarding copyright, fair use, or other aspects of intellectual property rights. See the NLM Copyright page.
>
> * NLM reserves the right to change the type and format of its machine-readable data. NLM will take reasonable steps to inform users of any changes to the format of the data before the data are distributed via the announcement section or subscription to email and RSS updates.
### Citation Information
```
@inproceedings{wen-etal-2020-medal,
title = "{M}e{DAL}: Medical Abbreviation Disambiguation Dataset for Natural Language Understanding Pretraining",
author = "Wen, Zhi and
Lu, Xing Han and
Reddy, Siva",
booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.15",
pages = "130--135",
abstract = "One of the biggest challenges that prohibit the use of many current NLP methods in clinical settings is the availability of public datasets. In this work, we present MeDAL, a large medical text dataset curated for abbreviation disambiguation, designed for natural language understanding pre-training in the medical domain. We pre-trained several models of common architectures on this dataset and empirically showed that such pre-training leads to improved performance and convergence speed when fine-tuning on downstream medical tasks.",
}
```
### Contributions
Thanks to [@Narsil](https://github.com/Narsil) and [@xhlulu](https://github.com/xhlulu) for adding this dataset. |
malaysia-ai/mosaic-tinyllama | ---
language:
- ms
---
# Mosaic format for the filtered combined dataset to finetune TinyLlama models
This repository is to store dataset shards using mosaic format.
1. https://github.com/malaysia-ai/dedup-text-dataset/blob/main/tinyllama/combine-dataset.ipynb
2. using tokenizer https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-955k-token-2T
3. 4096 context length.
## how-to
1. git clone,
```bash
git lfs clone https://huggingface.co/datasets/malaysia-ai/mosaic-tinyllama
```
2. load it,
```python
from streaming import LocalDataset
import numpy as np
from streaming.base.format.mds.encodings import Encoding, _encodings


class UInt16(Encoding):
    def encode(self, obj) -> bytes:
        return obj.tobytes()

    def decode(self, data: bytes):
        return np.frombuffer(data, np.uint16)


_encodings['uint16'] = UInt16

dataset = LocalDataset('mosaic-tinyllama')
len(dataset)
``` |
thisisHJLee/vi_data_made2 | ---
license: apache-2.0
---
|
Malikeh1375/medical-question-answering-datasets | ---
language:
- en
task_categories:
- question-answering
tags:
- medical
- clinical
- healthcare
dataset_info:
- config_name: all-processed
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 276980695
num_examples: 246678
download_size: 0
dataset_size: 276980695
- config_name: chatdoctor_healthcaremagic
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 126454896
num_examples: 112165
download_size: 70518147
dataset_size: 126454896
- config_name: chatdoctor_icliniq
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7347194
num_examples: 7321
download_size: 4153680
dataset_size: 7347194
- config_name: medical_meadow_cord19
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1336834621
num_examples: 821007
download_size: 752855706
dataset_size: 1336834621
- config_name: medical_meadow_health_advice
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2196957
num_examples: 8676
download_size: 890725
dataset_size: 2196957
- config_name: medical_meadow_medical_flashcards
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16453987
num_examples: 33955
download_size: 6999958
dataset_size: 16453987
- config_name: medical_meadow_mediqa
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15690088
num_examples: 2208
download_size: 3719929
dataset_size: 15690088
- config_name: medical_meadow_medqa
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10225018
num_examples: 10178
download_size: 5505473
dataset_size: 10225018
- config_name: medical_meadow_mmmlu
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1442124
num_examples: 3787
download_size: 685604
dataset_size: 1442124
- config_name: medical_meadow_pubmed_causal
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 846695
num_examples: 2446
download_size: 210947
dataset_size: 846695
- config_name: medical_meadow_wikidoc
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10224074
num_examples: 10000
download_size: 5593178
dataset_size: 10224074
- config_name: medical_meadow_wikidoc_patient_information
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3262558
num_examples: 5942
download_size: 1544286
dataset_size: 3262558
configs:
- config_name: all-processed
data_files:
- split: train
path: all-processed/train-*
- config_name: chatdoctor_healthcaremagic
data_files:
- split: train
path: chatdoctor_healthcaremagic/train-*
- config_name: chatdoctor_icliniq
data_files:
- split: train
path: chatdoctor_icliniq/train-*
- config_name: medical_meadow_cord19
data_files:
- split: train
path: medical_meadow_cord19/train-*
- config_name: medical_meadow_health_advice
data_files:
- split: train
path: medical_meadow_health_advice/train-*
- config_name: medical_meadow_medical_flashcards
data_files:
- split: train
path: medical_meadow_medical_flashcards/train-*
- config_name: medical_meadow_mediqa
data_files:
- split: train
path: medical_meadow_mediqa/train-*
- config_name: medical_meadow_medqa
data_files:
- split: train
path: medical_meadow_medqa/train-*
- config_name: medical_meadow_mmmlu
data_files:
- split: train
path: medical_meadow_mmmlu/train-*
- config_name: medical_meadow_pubmed_causal
data_files:
- split: train
path: medical_meadow_pubmed_causal/train-*
- config_name: medical_meadow_wikidoc
data_files:
- split: train
path: medical_meadow_wikidoc/train-*
- config_name: medical_meadow_wikidoc_patient_information
data_files:
- split: train
path: medical_meadow_wikidoc_patient_information/train-*
---
|
Patt/ReCoRD_TH | ---
task_categories:
- text-classification
language:
- en
- th
license: cc-by-sa-4.0
---
# Dataset Card for ReCoRD_TH
### Dataset Description
This dataset is a Thai-translated version of [ReCoRD](https://huggingface.co/datasets/super_glue/viewer/record), produced with Google Translate; translation quality is scored with the [Multilingual Universal Sentence Encoder](https://arxiv.org/abs/1907.04307). |
VuongQuoc/60k_dataset_multichoice | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: token_type_ids
sequence:
sequence: int8
- name: attention_mask
sequence:
sequence: int8
- name: label
dtype: int64
splits:
- name: train
num_bytes: 465592764
num_examples: 60000
- name: test
num_bytes: 1552000
num_examples: 200
download_size: 52157007
dataset_size: 467144764
---
# Dataset Card for "60k_dataset_multichoice"
- MAX_LEN = 256
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SamAct/medium_cleaned | ---
license: unlicense
---
|
mariakmurphy55/titanicdata | ---
language:
- en
pretty_name: titanic data
size_categories:
- 1K<n<10K
---
# Dataset Card for Titanic Data
Training and testing data for Titanic passengers' survival.
## Dataset Details
### Dataset Description
Train:
- Dimensions --> 891x12
- Column names --> "PassengerId", "Survived", "Pclass", "Name", "Sex", "Age", "SibSp", "Parch", "Ticket", "Fare", "Cabin", and "Embarked"
Test:
- Dimensions --> 418x11
- Column names --> "PassengerId", "Pclass", "Name", "Sex", "Age", "SibSp", "Parch", "Ticket", "Fare", "Cabin", and "Embarked"
### Dataset Sources
Kaggle Titanic dataset
https://www.kaggle.com/competitions/titanic
## Uses
Raw datasets used in an introduction to DVC and Amazon S3 buckets.
## Dataset Structure
### Column definitions:
- "PassengerId" --> key for each passenger (int64)
- "Survived" --> binary variable indicating survival (int64)
- "Pclass" --> first, second, or third class (int64)
- "Name" --> passenger name; maiden name in parentheses for married women (object)
- "Sex" --> male or female (object)
- "Age" --> passenger age (float64)
- "SibSp" --> number of siblings/spouses aboard (int64)
- "Parch" --> number of parents/children aboard (int64)
- "Ticket" --> ticket identifier (object)
- "Fare" --> float variable (float64)
- "Cabin" --> cabin identifier (object)
- "Embarked" --> C, Q, or S (object)
Categorical columns: "Name", "Sex", "Ticket", "Cabin", "Embarked"
Continuous columns: "PassengerId", "Pclass", "SibSp", "Parch", "Age", "Fare"
### Quick Facts:
Train:
- PassengerID, Survived, Pclass, Name, Sex, SibSp, Parch, Ticket, and Fare have no NA values
- Age not documented for 177 passengers (19.8653% NA)
- Cabin not documented for 687 passengers (77.1044% NA)
- Embarked not documented for 2 passengers (0.2245% NA)
Test:
- PassengerID, Pclass, Name, Sex, SibSp, Parch, Ticket, and Embarked have no NA values
- Age not documented for 86 passengers (20.5742% NA)
- Fare not documented for 1 passenger (0.2392% NA)
- Cabin not documented for 327 passengers (78.2297% NA)
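The NA percentages above follow directly from the missing counts and split sizes. A dependency-free sketch of the calculation:

```python
def na_percentage(n_missing: int, n_rows: int) -> float:
    """Share of missing entries in a column, as a percentage rounded to 4 places."""
    return round(100 * n_missing / n_rows, 4)

# Train split (891 rows): Age has 177 missing values.
print(na_percentage(177, 891))  # 19.8653
# Test split (418 rows): Fare has 1 missing value.
print(na_percentage(1, 418))    # 0.2392
```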
### Summary Statistics:
Train:

Test:

## Dataset Card Author
Maria Murphy |
mstz/mammography | ---
language:
- en
tags:
- mammography
- tabular_classification
- binary_classification
- UCI
pretty_name: Mammography
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- mammography
license: cc
---
# Mammography
The [Mammography dataset](https://archive.ics.uci.edu/ml/datasets/Mammography) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|------------------------|
| mammography | Binary classification | Is the lesion benign? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/mammography")["train"]
``` |
danaroth/samson | ---
license: unknown
---
# Description
Samson is a simple dataset available from the [website](http://opticks.org/confluence/display/opticks/Sample+Data). The image contains 952x952 pixels, each recorded at 156 channels covering the wavelengths from 401 nm to 889 nm; the spectral resolution is as high as 3.13 nm. As the original image is too large, which makes it very expensive in terms of computational cost, a region of 95x95 pixels is used, starting from the (252,332)-th pixel in the original image. This data is not degraded by blank channels or badly noised channels. Specifically, there are three targets in this image, i.e. "#1 Soil", "#2 Tree" and "#3 Water".
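The crop described above amounts to a simple array slice. A sketch with NumPy, where the (rows, cols, bands) axis order and the 0-based (252, 332) origin are assumptions about how the cube is loaded:

```python
import numpy as np

# Zero-copy placeholder standing in for the real 952 x 952 x 156 cube.
full = np.broadcast_to(np.float32(0.0), (952, 952, 156))

# Crop origin and side length; 0-based indexing is an assumption.
r0, c0, size = 252, 332, 95
crop = full[r0:r0 + size, c0:c0 + size, :]
print(crop.shape)  # (95, 95, 156)
```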
# Quick look
<figure>
<img src= "assets/D7_1.png" alt="Samson" width="500" />
<figcaption>Samson and its ground truths.</figcaption>
</figure>
# Credits
Dataset originally made available by [Opticks](https://www.opticks.org/). |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-4000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1053763
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alexmaraval/gsm8k_objective_examples | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: CoT_example
dtype: string
- name: rationale
dtype: string
- name: CoT_embedding
sequence: float64
- name: question_embedding
sequence: float64
- name: rationale_embedding
sequence: float64
- name: answer_embedding
sequence: float64
splits:
- name: train
num_bytes: 2595606.0350595475
num_examples: 100
download_size: 1932749
dataset_size: 2595606.0350595475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Myashka/SO-Python_basics_QA-filtered-2023-tanh_score | ---
license: mit
language:
- en
---
SO dataset of Python-tag data from the "Python basics and Environment" subcategory
Question filters:
- images
- links
- code blocks
- Q_Score > 0
- Answer_count > 0
Answers filters:
- images
- links
- code blocks
Scores are obtained by scaling the original SO answers' scores to their IQR range with AbsMaxScaler and then applying tanh. |
ManavSinghal157/NoFunEval | ---
license: mit
configs:
- config_name: default
data_files:
- split: latency
path: "datasets/latency.jsonl"
- split: resource_util
path: "datasets/resource_util.jsonl"
- split: runtime_efficiency
path: "datasets/runtime_efficiency.jsonl"
- split: maintainability
path: "datasets/maintainability.jsonl"
- split: security
path: "datasets/security.jsonl"
- split: humanevalclassify
path: "datasets/humanevalclassify.jsonl"
---
# NoFunEval: Funny How Code LMs Falter on Requirements Beyond Functional Correctness
## Abstract:
Existing evaluation benchmarks of language models of code (code LMs) focus almost exclusively on whether the LMs can generate functionally-correct code. In real-world software engineering, developers think beyond functional correctness. They have requirements on "how" a functionality should be implemented to meet overall system design objectives like efficiency, security, and maintainability. They would also trust the code LMs more if the LMs demonstrate robust understanding of requirements and code semantics.
We propose a new benchmark NoFunEval to evaluate code LMs on non-functional requirements and simple classification instances for both functional and non-functional requirements. We propose a prompting method, Coding Concepts (CoCo), as a way for a developer to communicate the domain knowledge to the LMs. We conduct an extensive evaluation of twenty-two code LMs. Our finding is that they generally falter when tested on our benchmark, hinting at fundamental blindspots in their training setups. Surprisingly, even the classification accuracy on functional-correctness instances derived from the popular HumanEval benchmark is low, calling in question the depth of their comprehension and the source of their success in generating functionally-correct code in the first place.
Arxiv Link: https://arxiv.org/pdf/2401.15963.pdf
[Work on code release is in progress.]
# Generation
## Environment Setup
Create a virtual environment.
```console
bash setup.sh
```
### NoFunEdit
```console
python3 src/nofunedit_generation.py --data_subset <subset from nofunedit: eg-latency> --model_path <model name from HF: eg-WizardLM/WizardCoder-15B-V1.0> --temperature <temperature to be set for model generation: eg-0> --max_new_tokens <maximum number of new tokens to be generated: eg-5192> --prompt <type of prompt to use from our dataset: eg-base_prompt> --num_samples <number of samples to be generated: eg-1> --precision <floating point format: eg-fp16> --batch_size <number of examples to send to llm engine at once: eg-1>
```
### Classification
```console
python3 src/classification_generation.py --data_subset <subset from non_func or humanevalclassify: eg-latency> --model <model name from HF: eg-WizardLM/WizardCoder-15B-V1.0> --temperature <temperature to be set for model generation: eg-0> --max_new_tokens <maximum number of new tokens to be generated: eg-5192> --prompt <type of prompt to use from our dataset: eg-base_prompt> --precision <floating point format: eg-fp16> --batch_size <number of examples to send to llm engine at once: eg-1>
```
# Evaluation Scripts
## Evaluation
```console
python3 src/evaluation.py --data_subset <subset from nofunedit: eg-latency> --model_path <model name from HF: eg-WizardLM/WizardCoder-15B-V1.0> --prompt <type of prompt to use from our dataset: eg-base_prompt> --num_samples <number of samples to be generated: eg-1> --score_k <K values for score@k: eg-1,5,10,20> --metric <eval_metric to be used: eg-diffbleu>
```
### Example eval script (For maintainability)
```console
bash evaluation_example_script.sh
```
## Parameters
| Parameter | Description |
| ----------------------------- | ---------------------------------------- |
| `data_subset` | The subset of data to use. Options: `latency`, `resource_util`, `maintainability`, `security`, `runtime_efficiency` for nofunedit. Additionally `humanevalclassify` for classification.|
| `model_path` | The path of the model from HF. Example: `WizardLM/WizardCoder-15B-V1.0`. |
| `prompt` | Prompt to use. Options: `base_prompt`, `one-shot`, `chain_of_thought`, `coding_concepts`. |
| `num_samples` | Number of samples to generate. Example: `1` (We used `1` for greedy and `20` for higher temperature). **[NoFunEdit - Generation only]**|
| `max_new_tokens` | Budget for new token generation for a model. Example: `1200` (NoFunEdit: We used `1200` for runtime_efficiency and security for all prompts except CoT, where `1500` was used. For others, we used `5192` or the maximum possible limit. Classification: We used `4` for all generations).|
| `temperature` | Temperature for model generation. Example: `0` (We used `0` for greedy and `0.8` for higher samples) |
| `score_k` | K values for Score@K. Example: `1,5,10,20` (Should not be greater than num_samples and is comma separated) **[Eval only]** |
| `metric` | Metric to be used for evaluation. Option: `diffbleu`, `codeql`, `codeql-diffbleu` (to be run after first two params are run), `classification`, `runtime` **[Eval only]**|
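The `score_k` parameter selects K values for Score@K. If Score@K follows the standard unbiased pass@k-style estimator, it can be computed from `n` generated samples of which `c` pass the chosen metric; whether NoFunEval defines Score@K exactly this way is an assumption, so the sketch below is illustrative only.

```python
from math import comb

def score_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k-style estimator: P(at least one of k draws passes)."""
    if n - c < k:
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

print(score_at_k(20, 5, 1))   # 0.25
print(score_at_k(20, 20, 5))  # 1.0
```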
#### VLLM Parameters (for generation)
| Parameter | Description |
| ----------------------------- | ---------------------------------------- |
| `batch-size` | Batch size. Default: `1`|
| `precision` | Floating point format: Default: `fp16` |
| `tensor_parallel_size` | Default: `1` |
| `swap_space` | The size (GiB) of CPU memory per GPU to use as swap space: Default: `4` |
|
alisson40889/RAMBU | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_rte_serial_verb_give | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 3447
num_examples: 7
- name: train
num_bytes: 4199
num_examples: 8
download_size: 15341
dataset_size: 7646
---
# Dataset Card for "MULTI_VALUE_rte_serial_verb_give"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emozilla/yarn-train-tokenized-32k-mistral | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 44335107704
num_examples: 104074
download_size: 12138496030
dataset_size: 44335107704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yarn-train-tokenized-32k-mistral"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DylanJHJ/START | ---
license: apache-2.0
---
|
EgilKarlsen/PKDD_GPTNEO_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307608907.5
num_examples: 37500
- name: test
num_bytes: 102536305.0
num_examples: 12500
download_size: 565388883
dataset_size: 410145212.5
---
# Dataset Card for "PKDD_GPTNEO_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tr416/dataset_20231006_231926 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73865
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_231926"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/7d3eb966 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1337
dataset_size: 188
---
# Dataset Card for "7d3eb966"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B | ---
pretty_name: Evaluation run of kingbri/chronolima-airo-grad-l2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kingbri/chronolima-airo-grad-l2-13B](https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T00:13:00.101023](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B/blob/main/results_2023-09-18T00-13-00.101023.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018770973154362415,\n\
\ \"em_stderr\": 0.0013898509848031188,\n \"f1\": 0.08510381711409391,\n\
\ \"f1_stderr\": 0.001943348962771241,\n \"acc\": 0.4478082161451867,\n\
\ \"acc_stderr\": 0.010806174983049747\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.018770973154362415,\n \"em_stderr\": 0.0013898509848031188,\n\
\ \"f1\": 0.08510381711409391,\n \"f1_stderr\": 0.001943348962771241\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13949962092494314,\n \
\ \"acc_stderr\": 0.0095434266871913\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908194\n\
\ }\n}\n```"
repo_url: https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T00_13_00.101023
path:
- '**/details_harness|drop|3_2023-09-18T00-13-00.101023.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T00-13-00.101023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T00_13_00.101023
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-13-00.101023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-13-00.101023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:57:29.540366.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:57:29.540366.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T00_13_00.101023
path:
- '**/details_harness|winogrande|5_2023-09-18T00-13-00.101023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T00-13-00.101023.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_57_29.540366
path:
- results_2023-08-09T11:57:29.540366.parquet
- split: 2023_09_18T00_13_00.101023
path:
- results_2023-09-18T00-13-00.101023.parquet
- split: latest
path:
- results_2023-09-18T00-13-00.101023.parquet
---
# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kingbri/chronolima-airo-grad-l2-13B](https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T00:13:00.101023](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B/blob/main/results_2023-09-18T00-13-00.101023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"em": 0.018770973154362415,
"em_stderr": 0.0013898509848031188,
"f1": 0.08510381711409391,
"f1_stderr": 0.001943348962771241,
"acc": 0.4478082161451867,
"acc_stderr": 0.010806174983049747
},
"harness|drop|3": {
"em": 0.018770973154362415,
"em_stderr": 0.0013898509848031188,
"f1": 0.08510381711409391,
"f1_stderr": 0.001943348962771241
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.0095434266871913
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908194
}
}
```
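Since the blob above keys per-task scores by harness task name, a small helper (an illustrative sketch, not part of the leaderboard tooling) can pull one metric across tasks while skipping the `"all"` aggregate:

```python
# Aggregated results blob, as shown above (per-task metrics keyed by task name).
results = {
    "all": {
        "em": 0.018770973154362415,
        "f1": 0.08510381711409391,
        "acc": 0.4478082161451867,
    },
    "harness|drop|3": {
        "em": 0.018770973154362415,
        "f1": 0.08510381711409391,
    },
    "harness|gsm8k|5": {"acc": 0.13949962092494314},
    "harness|winogrande|5": {"acc": 0.7561168113654302},
}


def per_task_metric(results, metric):
    """Collect one metric per task, skipping the 'all' aggregate entry."""
    return {
        task: scores[metric]
        for task, scores in results.items()
        if task != "all" and metric in scores
    }


acc_by_task = per_task_metric(results, "acc")
print(acc_by_task)
```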
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
heliosprime/twitter_dataset_1713153594 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 5898
num_examples: 15
download_size: 10492
dataset_size: 5898
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713153594"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ponkotuzamurai/conoha_qa | ---
license: apache-2.0
---
|
arieg/cluster05_large_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '003271'
'1': 003492
'2': 003911
'3': '004037'
'4': 005158
'5': 006779
'6': 007709
'7': 010810
'8': 012489
'9': '013540'
'10': 016821
'11': 019073
'12': 019417
'13': '020704'
'14': 021409
'15': 022348
'16': 026859
'17': 027987
'18': 029747
'19': 029816
'20': 031392
'21': '032332'
'22': 032800
'23': '034003'
'24': '042463'
'25': '043767'
'26': 045518
'27': 046930
'28': 049029
'29': 052508
'30': 059659
'31': 062180
'32': 063208
'33': 064809
'34': '067017'
'35': '074375'
'36': '074671'
'37': 075866
'38': 084055
'39': 085491
'40': 089485
'41': 091938
'42': 092292
'43': 092538
'44': 094033
'45': 095310
'46': 095724
'47': 095725
'48': 095727
'49': 096726
'50': 096944
'51': '103520'
'52': '105713'
'53': '105912'
'54': '106339'
'55': '106568'
'56': '107389'
'57': '107588'
'58': '107852'
'59': '108299'
'60': '108301'
'61': '108307'
'62': '108308'
'63': '108970'
'64': '109447'
'65': '109448'
'66': '109896'
'67': '109901'
'68': '109906'
'69': '110436'
'70': '110437'
'71': '110438'
'72': '110439'
'73': '110441'
'74': '112976'
'75': '112977'
'76': '112978'
'77': '113259'
'78': '113276'
'79': '113281'
'80': '114371'
'81': '115591'
'82': '116029'
'83': '116456'
'84': '116883'
'85': '118496'
'86': '120322'
'87': '121318'
'88': '122352'
'89': '122357'
'90': '122365'
'91': '122621'
'92': '122626'
'93': '122631'
'94': '124180'
'95': '125193'
'96': '126241'
'97': '126747'
'98': '126748'
'99': '126778'
'100': '127189'
'101': '127289'
'102': '127331'
'103': '127520'
'104': '129683'
'105': '130953'
'106': '131985'
'107': '132454'
'108': '132455'
'109': '132793'
'110': '133100'
'111': '133788'
'112': '133977'
'113': '134084'
'114': '135228'
'115': '135369'
'116': '135370'
'117': '138015'
'118': '138319'
'119': '138414'
'120': '139521'
'121': '145458'
'122': '145551'
'123': '146961'
'124': '146970'
'125': '148082'
'126': '148233'
'127': '148429'
'128': '149118'
'129': '149139'
'130': '150267'
'131': '153452'
splits:
- name: train
num_bytes: 72900678.84
num_examples: 1320
download_size: 72621719
dataset_size: 72900678.84
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ayanami_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ayanami/綾波/绫波 (Azur Lane)
This is the dataset of ayanami/綾波/绫波 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, ponytail, bangs, headgear, hair_between_eyes, sidelocks, hair_ornament, breasts, blonde_hair, orange_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 772.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayanami_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 402.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayanami_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1305 | 906.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayanami_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 666.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayanami_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1305 | 1.30 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ayanami_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
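The IMG+TXT packages unpack to image files accompanied by same-stem `.txt` tag files. A minimal sketch of pairing them after extraction (the exact extension set here is an assumption, not guaranteed by the archives):

```python
import os

# Extensions assumed for image files in the extracted IMG+TXT package.
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}


def pair_images_with_tags(filenames):
    """Match each image file name to its same-stem .txt tag file, if present."""
    names = set(filenames)
    pairs = []
    for name in sorted(names):
        stem, ext = os.path.splitext(name)
        if ext.lower() in IMAGE_EXTS:
            txt = stem + ".txt"
            pairs.append((name, txt if txt in names else None))
    return pairs


# Example: one image with tags, one without.
print(pair_images_with_tags(["001.png", "001.txt", "002.jpg"]))
```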
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ayanami_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, detached_sleeves, holding_sword, china_dress, hair_ribbon, red_ribbon, thighhighs, cleavage, simple_background, elbow_gloves, hairclip, light_brown_hair, long_sleeves, side_slit, smile, very_long_hair, black_gloves, fingerless_gloves, greatsword, pelvic_curtain, white_background, wide_sleeves |
| 1 | 33 |  |  |  |  |  | 1girl, blue_skirt, detached_sleeves, looking_at_viewer, midriff, pleated_skirt, serafuku, solo, wide_sleeves, retrofit_(azur_lane), white_thighhighs, belt, navel, zettai_ryouiki, blue_sailor_collar, long_sleeves, simple_background, collarbone, white_background, bare_shoulders, crop_top, blush, ascot, white_shirt, hairclip, high_ponytail, black_choker, medium_breasts, miniskirt, sitting, closed_mouth, stomach |
| 2 | 9 |  |  |  |  |  | 1girl, blue_sailor_collar, looking_at_viewer, serafuku, sleeveless_shirt, solo, white_shirt, bare_shoulders, crop_top, midriff, simple_background, white_background, navel, bandaged_leg, bare_arms, blue_skirt, blush, pleated_skirt, yellow_ascot, anchor_symbol, closed_mouth, collarbone, cowboy_shot, high_ponytail |
| 3 | 6 |  |  |  |  |  | 1girl, belt, cloudy_sky, detached_sleeves, holding_sword, machinery, midriff, navel, ocean, pleated_skirt, retrofit_(azur_lane), serafuku, solo, torpedo_tubes, white_thighhighs, wide_sleeves, looking_at_viewer, turret, zettai_ryouiki, bare_shoulders, blue_skirt, choker, crop_top, underboob, hairclip, high_ponytail, outdoors, rigging, sailor_collar, shirt, standing |
| 4 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, solo, blue_shirt, short_sleeves, blue_skirt, blush, pleated_skirt, white_background, alternate_costume, hairclip, simple_background, white_sailor_collar, wrist_cuffs, yellow_bowtie, frills, parted_lips, very_long_hair, high_ponytail, medium_breasts, smile, white_thighhighs |
| 5 | 10 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, looking_at_viewer, navel, solo, wide_sleeves, blush, midriff, off_shoulder, choker, long_sleeves, white_thighhighs, uchikake, cleavage, detached_sleeves, stomach |
| 6 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, hair_bow, mask_on_head, wide_sleeves, bare_shoulders, fox_mask, blush, long_sleeves, pleated_skirt, white_thighhighs, collarbone, black_skirt, smile, detached_collar, hairclip, zettai_ryouiki, cleavage, sleeves_past_wrists, fur_trim, kimono |
| 7 | 7 |  |  |  |  |  | 1girl, black_pantyhose, black_shorts, looking_at_viewer, midriff, navel, short_shorts, solo, bare_shoulders, headphones, legwear_under_shorts, retrofit_(azur_lane), suspenders, blush, high_ponytail, off_shoulder, black_coat, black_jacket, brown_pantyhose, medium_breasts, open_coat, parted_lips, sitting, stomach, very_long_hair |
| 8 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, bare_shoulders, collarbone, medium_breasts, outdoors, blue_sky, cloudy_sky, day, navel, white_bikini, cleavage, holding, wet, hairclip, ocean, water |
| 9 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, collarbone, short_sleeves, bare_shoulders, pillow, white_shirt, white_thighhighs, headphones, clothes_writing, off-shoulder_shirt, sitting, bandaid, oversized_clothes, white_background, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | detached_sleeves | holding_sword | china_dress | hair_ribbon | red_ribbon | thighhighs | cleavage | simple_background | elbow_gloves | hairclip | light_brown_hair | long_sleeves | side_slit | smile | very_long_hair | black_gloves | fingerless_gloves | greatsword | pelvic_curtain | white_background | wide_sleeves | blue_skirt | midriff | pleated_skirt | serafuku | retrofit_(azur_lane) | white_thighhighs | belt | navel | zettai_ryouiki | blue_sailor_collar | collarbone | crop_top | blush | ascot | white_shirt | high_ponytail | black_choker | medium_breasts | miniskirt | sitting | closed_mouth | stomach | sleeveless_shirt | bandaged_leg | bare_arms | yellow_ascot | anchor_symbol | cowboy_shot | cloudy_sky | machinery | ocean | torpedo_tubes | turret | choker | underboob | outdoors | rigging | sailor_collar | shirt | standing | blue_shirt | short_sleeves | alternate_costume | white_sailor_collar | wrist_cuffs | yellow_bowtie | frills | parted_lips | off_shoulder | uchikake | hair_bow | mask_on_head | fox_mask | black_skirt | detached_collar | sleeves_past_wrists | fur_trim | kimono | black_pantyhose | black_shorts | short_shorts | headphones | legwear_under_shorts | suspenders | black_coat | black_jacket | brown_pantyhose | open_coat | blue_sky | day | white_bikini | holding | wet | water | pillow | clothes_writing | off-shoulder_shirt | bandaid | oversized_clothes | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:-------------------|:----------------|:--------------|:--------------|:-------------|:-------------|:-----------|:--------------------|:---------------|:-----------|:-------------------|:---------------|:------------|:--------|:-----------------|:---------------|:--------------------|:-------------|:-----------------|:-------------------|:---------------|:-------------|:----------|:----------------|:-----------|:-----------------------|:-------------------|:-------|:--------|:-----------------|:---------------------|:-------------|:-----------|:--------|:--------|:--------------|:----------------|:---------------|:-----------------|:------------|:----------|:---------------|:----------|:-------------------|:---------------|:------------|:---------------|:----------------|:--------------|:-------------|:------------|:--------|:----------------|:---------|:---------|:------------|:-----------|:----------|:----------------|:--------|:-----------|:-------------|:----------------|:--------------------|:----------------------|:--------------|:----------------|:---------|:--------------|:---------------|:-----------|:-----------|:---------------|:-----------|:--------------|:------------------|:----------------------|:-----------|:---------|:------------------|:---------------|:---------------|:-------------|:-----------------------|:-------------|:-------------|:---------------|:------------------|:------------|:-----------|:------|:---------------|:----------|:------|:--------|:---------|:------------------|:---------------------|:----------|:--------------------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 33 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | | | | | | | | X | | | | | | | | | | | | X | | X | X | X | X | | | | X | | X | X | X | X | | X | X | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | X | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | | | | | | | | | X | | X | | | | X | X | | | | | X | | X | | X | X | | X | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | X | | | | | | | | | X | | X | | | | X | | X | | | X | | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 16 |  |  |  |  |  | X | X | X | X | | | | | | | X | | | X | | X | | X | | | | | | | X | | | X | | | X | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | | | X | | | | | X | | | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | X | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | X | | | | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | |
| 9 | 8 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X |
|
xz56/openwebtext-tokenized-llama-256-1.5B | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 5881094452.0
num_examples: 5720909
- name: test
num_bytes: 309531828.0
num_examples: 301101
download_size: 3050956963
dataset_size: 6190626280.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Atipico1/nq-test-adv_passage | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
splits:
- name: train
num_bytes: 58397824
num_examples: 3610
download_size: 34346195
dataset_size: 58397824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepsynthbody/deepfake-ecg2 | ---
license: mit
---
|
yzhuang/autotree_automl_Higgs_gosdt_l512_d3_sd2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 12501600000
num_examples: 100000
- name: validation
num_bytes: 1250160000
num_examples: 10000
download_size: 9801930446
dataset_size: 13751760000
---
# Dataset Card for "autotree_automl_Higgs_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1712991992 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2320896
num_examples: 7105
download_size: 1318496
dataset_size: 2320896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_KoboldAI__GPT-NeoX-20B-Erebus | ---
pretty_name: Evaluation run of KoboldAI/GPT-NeoX-20B-Erebus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/GPT-NeoX-20B-Erebus](https://huggingface.co/KoboldAI/GPT-NeoX-20B-Erebus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__GPT-NeoX-20B-Erebus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T16:29:58.049517](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-NeoX-20B-Erebus/blob/main/results_2023-10-24T16-29-58.049517.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413213,\n \"f1\": 0.050781250000000264,\n\
\ \"f1_stderr\": 0.0012129008741175679,\n \"acc\": 0.3519405232133358,\n\
\ \"acc_stderr\": 0.00860227452891923\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413213,\n\
\ \"f1\": 0.050781250000000264,\n \"f1_stderr\": 0.0012129008741175679\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.004106620637749689\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.681136543014996,\n \"acc_stderr\": 0.013097928420088771\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/GPT-NeoX-20B-Erebus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T16_29_58.049517
path:
- '**/details_harness|drop|3_2023-10-24T16-29-58.049517.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T16-29-58.049517.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T16_29_58.049517
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-29-58.049517.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-29-58.049517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:38:23.585493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:38:23.585493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:38:23.585493.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T16_29_58.049517
path:
- '**/details_harness|winogrande|5_2023-10-24T16-29-58.049517.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T16-29-58.049517.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_38_23.585493
path:
- results_2023-07-19T21:38:23.585493.parquet
- split: 2023_10_24T16_29_58.049517
path:
- results_2023-10-24T16-29-58.049517.parquet
- split: latest
path:
- results_2023-10-24T16-29-58.049517.parquet
---
# Dataset Card for Evaluation run of KoboldAI/GPT-NeoX-20B-Erebus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/GPT-NeoX-20B-Erebus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/GPT-NeoX-20B-Erebus](https://huggingface.co/KoboldAI/GPT-NeoX-20B-Erebus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__GPT-NeoX-20B-Erebus",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T16:29:58.049517](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-NeoX-20B-Erebus/blob/main/results_2023-10-24T16-29-58.049517.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413213,
"f1": 0.050781250000000264,
"f1_stderr": 0.0012129008741175679,
"acc": 0.3519405232133358,
"acc_stderr": 0.00860227452891923
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413213,
"f1": 0.050781250000000264,
"f1_stderr": 0.0012129008741175679
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749689
},
"harness|winogrande|5": {
"acc": 0.681136543014996,
"acc_stderr": 0.013097928420088771
}
}
```
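The `acc` value in the `"all"` block is the mean of the per-task accuracies reported below it. This can be cross-checked offline with plain Python, reproducing only the relevant fields from the JSON above:

```python
# Subset of the "Latest results" JSON above, copied for illustration.
results = {
    "harness|drop|3": {"em": 0.0009437919463087249, "f1": 0.050781250000000264},
    "harness|gsm8k|5": {"acc": 0.022744503411675512},
    "harness|winogrande|5": {"acc": 0.681136543014996},
}

# Average the accuracy over the tasks that report one.
accs = [v["acc"] for v in results.values() if "acc" in v]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.3519, matching the "all" block above
```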
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
umarigan/textbook_is_all_you_need | ---
dataset_info:
features:
- name: page_content
dtype: string
- name: metadata
struct:
- name: page
dtype: int64
- name: source
dtype: string
splits:
- name: train
num_bytes: 36605083
num_examples: 16869
download_size: 22765390
dataset_size: 36605083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- tr
pretty_name: Text data from fictions
---

This dataset was created from 30+ classic Turkish fiction works.
|
stanmalkinson199/StanMarshClassic | ---
license: openrail
---
|
amaye15/Stack-Overflow-Zero-Shot-Classification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Title
dtype: string
- name: Tags
dtype: string
- name: Predicted_Tags
dtype: string
- name: Predicted_Tag_Scores
sequence: float64
splits:
- name: train
num_bytes: 27853258
num_examples: 111030
download_size: 16579853
dataset_size: 27853258
---
# Dataset Card for "Stack-Overflow-Zero-Shot-Classification"
# Automatic Stack Overflow Question Classifier
## Important
All credit goes to Hugging Face user [MoritzLaurer](https://huggingface.co/MoritzLaurer), whose model is the basis for this project.
## Introduction
The Automatic Stack Overflow Question Classifier harnesses the latest advancements in artificial intelligence to systematically categorize questions on Stack Overflow. Its primary goal is to streamline the process of sorting queries, enhancing navigability, and improving the overall user experience on the platform.
## About the Project
This initiative takes advantage of the DeBERTa V3 model's capabilities in zero-shot classification. By doing so, it aims to revolutionize how questions are organized on Stack Overflow. Instead of relying on manual categorization, which can be time-consuming and inconsistent, this project introduces an automated, AI-driven approach for more precise and efficient question sorting.
## Code and Repository
Access the complete source code and project details on GitHub: [Stack Overflow Question Classifier Repository](https://github.com/amaye15/stackoverflow-question-classifier).
## Streamlit App
Access our live classifier [here](https://stack-overflow-question-classifier.streamlit.app/). This interactive web application demonstrates the model's capabilities in real-time.
## Model
Learn more about the DeBERTa V3 model and its adaptation for this project on Hugging Face: [DeBERTa V3 on Hugging Face](https://huggingface.co/amaye15/Stack-Overflow-Zero-Shot-Classification).
## Dataset
The dataset, curated specifically for this project, can be found here: [Stack Overflow Zero-Shot Classification Dataset](https://huggingface.co/datasets/amaye15/Stack-Overflow-Zero-Shot-Classification). It encompasses a wide range of Stack Overflow questions, providing a comprehensive base for model training and testing.
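Per the feature schema above, each row pairs a `Predicted_Tags` string with a parallel `Predicted_Tag_Scores` list. A minimal sketch of ranking the predictions for a row (the comma-separated tag encoding and the example row are assumptions for illustration, not confirmed by the card):

```python
# Hypothetical row mirroring the schema above; the comma-separated
# encoding of Predicted_Tags is an assumption, not confirmed by the card.
row = {
    "Title": "How do I merge two dicts in Python?",
    "Predicted_Tags": "python, dictionary, merge",
    "Predicted_Tag_Scores": [0.91, 0.72, 0.65],
}

# Pair each tag with its score and sort by descending confidence.
tags = [t.strip() for t in row["Predicted_Tags"].split(",")]
ranked = sorted(zip(tags, row["Predicted_Tag_Scores"]),
                key=lambda pair: pair[1], reverse=True)
best_tag, best_score = ranked[0]
print(best_tag, best_score)  # → python 0.91
```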
|
joey234/mmlu-prehistory-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 88639
num_examples: 324
download_size: 53960
dataset_size: 88639
---
# Dataset Card for "mmlu-prehistory-neg"
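Given the `class_label` mapping declared in the schema (`0` → A through `3` → D), a stored answer index can be converted back to its letter with a small lookup; a minimal sketch:

```python
# Letter names as declared in the class_label block of the schema.
ANSWER_NAMES = ["A", "B", "C", "D"]

def answer_letter(idx: int) -> str:
    """Map a stored class_label index to its answer letter."""
    return ANSWER_NAMES[idx]

print(answer_letter(2))  # → C
```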
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_01-ai__Yi-34B-Chat | ---
pretty_name: Evaluation run of 01-ai/Yi-34B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [01-ai/Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_01-ai__Yi-34B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-05T03:47:25.491369](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-Chat/blob/main/results_2023-12-05T03-47-25.491369.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7393930299846158,\n\
\ \"acc_stderr\": 0.028807135333088364,\n \"acc_norm\": 0.7489434623723922,\n\
\ \"acc_norm_stderr\": 0.02935457295982731,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5536831362008046,\n\
\ \"mc2_stderr\": 0.015524186394858242\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955012,\n\
\ \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145678\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6536546504680343,\n\
\ \"acc_stderr\": 0.004748324319714274,\n \"acc_norm\": 0.8415654252141008,\n\
\ \"acc_norm_stderr\": 0.003644017383711605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n\
\ \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n\
\ \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n\
\ \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\":\
\ 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"\
acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.034961014811911786,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.034961014811911786\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.02767845257821239,\n\
\ \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.02767845257821239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6349206349206349,\n\
\ \"acc_stderr\": 0.024796060602699965,\n \"acc_norm\": 0.6349206349206349,\n\
\ \"acc_norm_stderr\": 0.024796060602699965\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n\
\ \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8709677419354839,\n \"acc_stderr\": 0.019070889254792767,\n\
\ \"acc_norm\": 0.8709677419354839,\n \"acc_norm_stderr\": 0.019070889254792767\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6206896551724138,\n \"acc_stderr\": 0.03413963805906235,\n \"\
acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055343,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262585,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262585\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7846153846153846,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.7846153846153846,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.908256880733945,\n \"acc_stderr\": 0.012376323409137123,\n \"\
acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137123\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6342592592592593,\n \"acc_stderr\": 0.03284738857647206,\n \"\
acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.03284738857647206\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n\
\ \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339657,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339657\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.0328818027880863,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.0328818027880863\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562586,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562586\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n\
\ \"acc_stderr\": 0.01077047201488672,\n \"acc_norm\": 0.8991060025542784,\n\
\ \"acc_norm_stderr\": 0.01077047201488672\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7016759776536313,\n\
\ \"acc_stderr\": 0.01530184004512928,\n \"acc_norm\": 0.7016759776536313,\n\
\ \"acc_norm_stderr\": 0.01530184004512928\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123137,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123137\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8102893890675241,\n\
\ \"acc_stderr\": 0.02226819625878322,\n \"acc_norm\": 0.8102893890675241,\n\
\ \"acc_norm_stderr\": 0.02226819625878322\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5495436766623207,\n\
\ \"acc_stderr\": 0.012707390438502348,\n \"acc_norm\": 0.5495436766623207,\n\
\ \"acc_norm_stderr\": 0.012707390438502348\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711274,\n\
\ \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711274\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.826797385620915,\n \"acc_stderr\": 0.015309329266969138,\n \
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.015309329266969138\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5536831362008046,\n\
\ \"mc2_stderr\": 0.015524186394858242\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.01121862997251531\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3191811978771797,\n \
\ \"acc_stderr\": 0.012840345676251648\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/T3QM7X
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|arc:challenge|25_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|arc:challenge|25_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|drop|3_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|gsm8k|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|gsm8k|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hellaswag|10_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hellaswag|10_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-47-25.491369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T03-47-25.491369.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|winogrande|5_2023-11-26T08-55-32.839765.parquet'
- split: 2023_12_05T03_47_25.491369
path:
- '**/details_harness|winogrande|5_2023-12-05T03-47-25.491369.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-05T03-47-25.491369.parquet'
- config_name: results
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- results_2023-11-26T08-55-32.839765.parquet
- split: 2023_12_05T03_47_25.491369
path:
- results_2023-12-05T03-47-25.491369.parquet
- split: latest
path:
- results_2023-12-05T03-47-25.491369.parquet
---
# Dataset Card for Evaluation run of 01-ai/Yi-34B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/01-ai/Yi-34B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [01-ai/Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_01-ai__Yi-34B-Chat",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-12-05T03:47:25.491369](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-Chat/blob/main/results_2023-12-05T03-47-25.491369.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7393930299846158,
"acc_stderr": 0.028807135333088364,
"acc_norm": 0.7489434623723922,
"acc_norm_stderr": 0.02935457295982731,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5536831362008046,
"mc2_stderr": 0.015524186394858242
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955012,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145678
},
"harness|hellaswag|10": {
"acc": 0.6536546504680343,
"acc_stderr": 0.004748324319714274,
"acc_norm": 0.8415654252141008,
"acc_norm_stderr": 0.003644017383711605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.034961014811911786,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.034961014811911786
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6349206349206349,
"acc_stderr": 0.024796060602699965,
"acc_norm": 0.6349206349206349,
"acc_norm_stderr": 0.024796060602699965
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792767,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055343,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262585,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7846153846153846,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.7846153846153846,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137123,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137123
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339657,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339657
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.0328818027880863,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.0328818027880863
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562586,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.01077047201488672,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.01077047201488672
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7016759776536313,
"acc_stderr": 0.01530184004512928,
"acc_norm": 0.7016759776536313,
"acc_norm_stderr": 0.01530184004512928
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123137,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123137
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8102893890675241,
"acc_stderr": 0.02226819625878322,
"acc_norm": 0.8102893890675241,
"acc_norm_stderr": 0.02226819625878322
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062072,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062072
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5495436766623207,
"acc_stderr": 0.012707390438502348,
"acc_norm": 0.5495436766623207,
"acc_norm_stderr": 0.012707390438502348
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.025035845227711274,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.025035845227711274
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.015309329266969138,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.015309329266969138
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5536831362008046,
"mc2_stderr": 0.015524186394858242
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.01121862997251531
},
"harness|gsm8k|5": {
"acc": 0.3191811978771797,
"acc_stderr": 0.012840345676251648
}
}
```
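As a rough illustration, a macro-average over the MMLU ("hendrycksTest") subtasks can be computed from a results dict of this shape. This is a minimal sketch; the dict below is a small illustrative subset of the scores above, not the full results file:

```python
# Sketch: macro-average accuracy over the MMLU ("hendrycksTest") subtasks
# from a results dict shaped like the JSON above. The dict here is an
# illustrative subset only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.45},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7111111111111111},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8552631578947368},
    "harness|truthfulqa:mc|0": {"mc1": 0.3843329253365973},  # not an MMLU task
}

mmlu_scores = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU macro-average over {len(mmlu_scores)} subtasks: {mmlu_avg:.4f}")
```

The same filtering pattern works for any task family in the results, since the task name is encoded in the key.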
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
winddude/IHOP_flan_CoT | ---
license: apache-2.0
size_categories:
- 10K<n<100K
---
# IHOP FLAN CoT ZeroShot
A recreation of the FLAN CoT dataset, zero-shot only.
Because the FLAN framework is a nightmare to work with, and the "conceptofmind" repos are missing the raw targets for eval as well as incorrectly labeling "opt".
I also add double line breaks between every "thought"/"step" in the response, to make it easier to evaluate with something like the PRM outlined in "Let's Verify Step by Step".
The templates are very similar and close enough that it should not matter.
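The blank-line convention described above can be consumed by splitting responses on double newlines. A minimal sketch, with an illustrative (made-up) response:

```python
# Sketch: split a CoT response into individual steps using the
# double-line-break convention described above. The sample response
# is illustrative only, not taken from the dataset.
response = (
    "First, compute 2 + 3 = 5.\n\n"
    "Then, multiply by 4 to get 20.\n\n"
    "So the answer is 20."
)

steps = [s.strip() for s in response.split("\n\n") if s.strip()]
for i, step in enumerate(steps, 1):
    print(f"step {i}: {step}")
```

Each element of `steps` can then be scored independently by a process reward model.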
## Citations
```
@article{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Longpre, Shayne and Hou, Le and Vu, Tu and Webson, Albert and Chung, Hyung Won and Tay, Yi and Zhou, Denny and Le, Quoc V and Zoph, Barret and Wei, Jason and others},
journal={arXiv preprint arXiv:2301.13688},
year={2023}
}
@article{lightman2023lets,
title={Let's Verify Step by Step},
author={Lightman, Hunter and Kosaraju, Vineet and Burda, Yura and Edwards, Harri and Baker, Bowen and Lee, Teddy and Leike, Jan and Schulman, John and Sutskever, Ilya and Cobbe, Karl},
journal={arXiv preprint arXiv:2305.20050},
year={2023}
}
```
|
open-llm-leaderboard/details_BreadAi__MusePy-1-2 | ---
pretty_name: Evaluation run of BreadAi/MusePy-1-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BreadAi/MusePy-1-2](https://huggingface.co/BreadAi/MusePy-1-2) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__MusePy-1-2\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T17:04:48.338074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MusePy-1-2/blob/main/results_2023-12-03T17-04-48.338074.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BreadAi/MusePy-1-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T07_17_07.226410
path:
- '**/details_harness|drop|3_2023-10-25T07-17-07.226410.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T07-17-07.226410.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T07_17_07.226410
path:
- '**/details_harness|gsm8k|5_2023-10-25T07-17-07.226410.parquet'
- split: 2023_12_03T17_04_48.338074
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-04-48.338074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-04-48.338074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:08.820966.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:08.820966.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T07_17_07.226410
path:
- '**/details_harness|winogrande|5_2023-10-25T07-17-07.226410.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T07-17-07.226410.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_39_08.820966
path:
- results_2023-07-19T19:39:08.820966.parquet
- split: 2023_10_25T07_17_07.226410
path:
- results_2023-10-25T07-17-07.226410.parquet
- split: 2023_12_03T17_04_48.338074
path:
- results_2023-12-03T17-04-48.338074.parquet
- split: latest
path:
- results_2023-12-03T17-04-48.338074.parquet
---
# Dataset Card for Evaluation run of BreadAi/MusePy-1-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/MusePy-1-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BreadAi/MusePy-1-2](https://huggingface.co/BreadAi/MusePy-1-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__MusePy-1-2",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T17:04:48.338074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MusePy-1-2/blob/main/results_2023-12-03T17-04-48.338074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
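Each per-task entry in this payload carries an `acc`/`acc_stderr` pair; a minimal sketch (assuming the payload shape shown above) for flattening it into a task-to-accuracy map:

```python
import json

# Illustrative only: `raw` mirrors the example results payload above.
raw = """
{
  "all": {"acc": 0.0, "acc_stderr": 0.0},
  "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0}
}
"""

results = json.loads(raw)
# Skip the "all" aggregate and keep one accuracy per task.
accuracies = {task: metrics["acc"]
              for task, metrics in results.items() if task != "all"}
print(accuracies)  # {'harness|gsm8k|5': 0.0}
```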
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
evaluate/glue-ci | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- acceptability-classification
- natural-language-inference
- semantic-similarity-scoring
- sentiment-classification
- text-classification-other-coreference-nli
- text-classification-other-paraphrase-identification
- text-classification-other-qa-nli
- text-scoring
paperswithcode_id: glue
pretty_name: GLUE (General Language Understanding Evaluation benchmark)
train-eval-index:
- config: cola
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: sst2
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: mrpc
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: qqp
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question1: text1
question2: text2
label: target
- config: stsb
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: mnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation_matched
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_mismatched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_matched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: qnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question: text1
sentence: text2
label: target
- config: rte
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: wnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
configs:
- ax
- cola
- mnli
- mnli_matched
- mnli_mismatched
- mrpc
- qnli
- qqp
- rte
- sst2
- stsb
- wnli
---
# Dataset Card for GLUE
## Table of Contents
- [Dataset Card for GLUE](#dataset-card-for-glue)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [ax](#ax)
- [cola](#cola)
- [mnli](#mnli)
- [mnli_matched](#mnli_matched)
- [mnli_mismatched](#mnli_mismatched)
- [mrpc](#mrpc)
- [qnli](#qnli)
- [qqp](#qqp)
- [rte](#rte)
- [sst2](#sst2)
- [stsb](#stsb)
- [wnli](#wnli)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [ax](#ax-1)
- [cola](#cola-1)
- [mnli](#mnli-1)
- [mnli_matched](#mnli_matched-1)
- [mnli_mismatched](#mnli_mismatched-1)
- [mrpc](#mrpc-1)
- [qnli](#qnli-1)
- [qqp](#qqp-1)
- [rte](#rte-1)
- [sst2](#sst2-1)
- [stsb](#stsb-1)
- [wnli](#wnli-1)
- [Data Fields](#data-fields)
- [ax](#ax-2)
- [cola](#cola-2)
- [mnli](#mnli-2)
- [mnli_matched](#mnli_matched-2)
- [mnli_mismatched](#mnli_mismatched-2)
- [mrpc](#mrpc-2)
- [qnli](#qnli-2)
- [qqp](#qqp-2)
- [rte](#rte-2)
- [sst2](#sst2-2)
- [stsb](#stsb-2)
- [wnli](#wnli-2)
- [Data Splits](#data-splits)
- [ax](#ax-3)
- [cola](#cola-3)
- [mnli](#mnli-3)
- [mnli_matched](#mnli_matched-3)
- [mnli_mismatched](#mnli_mismatched-3)
- [mrpc](#mrpc-3)
- [qnli](#qnli-3)
- [qqp](#qqp-3)
- [rte](#rte-3)
- [sst2](#sst2-3)
- [stsb](#stsb-3)
- [wnli](#wnli-3)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://nyu-mll.github.io/CoLA/](https://nyu-mll.github.io/CoLA/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 955.33 MB
- **Size of the generated dataset:** 229.68 MB
- **Total amount of disk used:** 1185.01 MB
### Dataset Summary
GLUE, the General Language Understanding Evaluation benchmark (https://gluebenchmark.com/) is a collection of resources for training, evaluating, and analyzing natural language understanding systems.
### Supported Tasks and Leaderboards
The leaderboard for the GLUE benchmark can be found [at this address](https://gluebenchmark.com/). It comprises the following tasks:
#### ax
A manually-curated evaluation dataset for fine-grained analysis of system performance on a broad range of linguistic phenomena. This dataset evaluates sentence understanding through Natural Language Inference (NLI) problems. Use a model trained on MultiNLI to produce predictions for this dataset.
#### cola
The Corpus of Linguistic Acceptability consists of English acceptability judgments drawn from books and journal articles on linguistic theory. Each example is a sequence of words annotated with whether it is a grammatical English sentence.
#### mnli
The Multi-Genre Natural Language Inference Corpus is a crowdsourced collection of sentence pairs with textual entailment annotations. Given a premise sentence and a hypothesis sentence, the task is to predict whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral). The premise sentences are gathered from ten different sources, including transcribed speech, fiction, and government reports. The authors of the benchmark use the standard test set, for which they obtained private labels from the RTE authors, and evaluate on both the matched (in-domain) and mismatched (cross-domain) sections. They also use and recommend the SNLI corpus as 550k examples of auxiliary training data.
#### mnli_matched
The matched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mnli_mismatched
The mismatched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mrpc
The Microsoft Research Paraphrase Corpus (Dolan & Brockett, 2005) is a corpus of sentence pairs automatically extracted from online news sources, with human annotations for whether the sentences in the pair are semantically equivalent.
#### qnli
The Stanford Question Answering Dataset is a question-answering dataset consisting of question-paragraph pairs, where one of the sentences in the paragraph (drawn from Wikipedia) contains the answer to the corresponding question (written by an annotator). The authors of the benchmark convert the task into sentence pair classification by forming a pair between each question and each sentence in the corresponding context, and filtering out pairs with low lexical overlap between the question and the context sentence. The task is to determine whether the context sentence contains the answer to the question. This modified version of the original task removes the requirement that the model select the exact answer, but also removes the simplifying assumptions that the answer is always present in the input and that lexical overlap is a reliable cue.
#### qqp
The Quora Question Pairs2 dataset is a collection of question pairs from the community question-answering website Quora. The task is to determine whether a pair of questions are semantically equivalent.
#### rte
The Recognizing Textual Entailment (RTE) datasets come from a series of annual textual entailment challenges. The authors of the benchmark combined the data from RTE1 (Dagan et al., 2006), RTE2 (Bar Haim et al., 2006), RTE3 (Giampiccolo et al., 2007), and RTE5 (Bentivogli et al., 2009). Examples are constructed based on news and Wikipedia text. The authors of the benchmark convert all datasets to a two-class split, where for three-class datasets they collapse neutral and contradiction into not entailment, for consistency.
#### sst2
The Stanford Sentiment Treebank consists of sentences from movie reviews and human annotations of their sentiment. The task is to predict the sentiment of a given sentence. It uses the two-way (positive/negative) class split, with only sentence-level labels.
#### stsb
The Semantic Textual Similarity Benchmark (Cer et al., 2017) is a collection of sentence pairs drawn from news headlines, video and image captions, and natural language inference data. Each pair is human-annotated with a similarity score from 1 to 5.
#### wnli
The Winograd Schema Challenge (Levesque et al., 2011) is a reading comprehension task in which a system must read a sentence with a pronoun and select the referent of that pronoun from a list of choices. The examples are manually constructed to foil simple statistical methods: each one is contingent on contextual information provided by a single word or phrase in the sentence. To convert the problem into sentence pair classification, the authors of the benchmark construct sentence pairs by replacing the ambiguous pronoun with each possible referent. The task is to predict if the sentence with the pronoun substituted is entailed by the original sentence. They use a small evaluation set consisting of new examples derived from fiction books that was shared privately by the authors of the original corpus. While the included training set is balanced between two classes, the test set is imbalanced between them (65% not entailment). Also, due to a data quirk, the development set is adversarial: hypotheses are sometimes shared between training and development examples, so if a model memorizes the training examples, it will predict the wrong label on the corresponding development set example. As with QNLI, each example is evaluated separately, so there is not a systematic correspondence between a model's score on this task and its score on the unconverted original task. The authors of the benchmark call the converted dataset WNLI (Winograd NLI).
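The pronoun-substitution step described above can be sketched as follows; the sentence and candidate referents here are invented for illustration, not drawn from the dataset:

```python
import re

sentence = "The trophy doesn't fit in the suitcase because it is too big."
candidates = ["the trophy", "the suitcase"]

# Substitute the ambiguous pronoun with each candidate referent.
# A word-boundary match is used so the "it" inside "fit" and
# "suitcase" is left untouched.
hypotheses = [re.sub(r"\bit\b", ref, sentence, count=1)
              for ref in candidates]

for hyp in hypotheses:
    print(hyp)
```

Each (original sentence, substituted sentence) pair then becomes one binary entailment example.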
### Languages
The language data in GLUE is in English (BCP-47 `en`).
## Dataset Structure
### Data Instances
#### ax
- **Size of downloaded dataset files:** 0.21 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.44 MB
An example of 'test' looks as follows.
```
{
"premise": "The cat sat on the mat.",
"hypothesis": "The cat did not sit on the mat.",
"label": -1,
  "idx": 0
}
```
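A label of `-1`, as in this test example, means the gold label is withheld. A small sketch for decoding the integer labels of the NLI-style configs (mapping as listed in the Data Fields section):

```python
# id-to-name mapping for the NLI-style configs (ax, mnli, ...);
# -1 marks test examples whose gold labels are not released.
LABEL_NAMES = {0: "entailment", 1: "neutral", 2: "contradiction"}

def decode_label(label_id: int) -> str:
    return LABEL_NAMES.get(label_id, "unlabeled")

example = {
    "premise": "The cat sat on the mat.",
    "hypothesis": "The cat did not sit on the mat.",
    "label": -1,
    "idx": 0,
}
print(decode_label(example["label"]))  # unlabeled
```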
#### cola
- **Size of downloaded dataset files:** 0.36 MB
- **Size of the generated dataset:** 0.58 MB
- **Total amount of disk used:** 0.94 MB
An example of 'train' looks as follows.
```
{
"sentence": "Our friends won't buy this analysis, let alone the next one we propose.",
"label": 1,
  "idx": 0
}
```
#### mnli
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 78.65 MB
- **Total amount of disk used:** 376.95 MB
An example of 'train' looks as follows.
```
{
"premise": "Conceptually cream skimming has two basic dimensions - product and geography.",
"hypothesis": "Product and geography are what make cream skimming work.",
"label": 1,
"idx": 0
}
```
#### mnli_matched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.52 MB
- **Total amount of disk used:** 301.82 MB
An example of 'test' looks as follows.
```
{
"premise": "Hierbas, ans seco, ans dulce, and frigola are just a few names worth keeping a look-out for.",
"hypothesis": "Hierbas is a name worth looking out for.",
"label": -1,
"idx": 0
}
```
#### mnli_mismatched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.73 MB
- **Total amount of disk used:** 302.02 MB
An example of 'test' looks as follows.
```
{
"premise": "What have you decided, what are you going to do?",
  "hypothesis": "So what's your decision?",
"label": -1,
"idx": 0
}
```
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
#### ax
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### cola
- `sentence`: a `string` feature.
- `label`: a classification label, with possible values including `unacceptable` (0), `acceptable` (1).
- `idx`: a `int32` feature.
#### mnli
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_matched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_mismatched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Splits
#### ax
| |test|
|---|---:|
|ax |1104|
#### cola
| |train|validation|test|
|----|----:|---------:|---:|
|cola| 8551| 1043|1063|
#### mnli
| |train |validation_matched|validation_mismatched|test_matched|test_mismatched|
|----|-----:|-----------------:|--------------------:|-----------:|--------------:|
|mnli|392702| 9815| 9832| 9796| 9847|
#### mnli_matched
| |validation|test|
|------------|---------:|---:|
|mnli_matched| 9815|9796|
#### mnli_mismatched
| |validation|test|
|---------------|---------:|---:|
|mnli_mismatched| 9832|9847|
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
```

Note that each GLUE dataset has its own citation. Please see the source to see the correct citation for each contained dataset.
### Contributions
Thanks to [@patpizio](https://github.com/patpizio), [@jeswan](https://github.com/jeswan), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset.
|
anjalyjayakrishnan/test | ---
pretty_name: 'Snow Mountain'
language:
- hi
- bgc
- kfs
- dgo
- bhd
- gbk
- xnr
- kfx
- mjl
- kfo
- bfz
annotations_creators:
- ?
language_creators:
- ?
license: []
multilinguality:
- multilingual
size_categories:
-
source_datasets:
- Snow Mountain
tags: []
task_categories:
- automatic-speech-recognition
task_ids: []
configs:
- hi
- bgc
dataset_info:
- config_name: hi
features:
- name: Unnamed
dtype: int64
- name: sentence
dtype: string
- name: path
dtype: string
splits:
- name: train_500
num_examples: 400
- name: val_500
num_examples: 100
- name: train_1000
num_examples: 800
- name: val_1000
num_examples: 200
- name: test_common
num_examples: 500
dataset_size: 71.41 hrs
- config_name: bgc
features:
- name: Unnamed
dtype: int64
- name: sentence
dtype: string
- name: path
dtype: string
splits:
- name: train_500
num_examples: 400
- name: val_500
num_examples: 100
- name: train_1000
num_examples: 800
- name: val_1000
num_examples: 200
- name: test_common
num_examples: 500
dataset_size: 27.41 hrs
---
# Dataset Card for Snow Mountain
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://gitlabdev.bridgeconn.com/software/research/datasets/snow-mountain
- **Paper:** https://arxiv.org/abs/2206.01205
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The Snow Mountain dataset contains the audio recordings (in .mp3 format) and the corresponding text of The Bible in 11 Indian languages. The recordings were done in a studio setting by native speakers. Each language has a single speaker in the dataset. Most of these languages are geographically concentrated in the Northern part of India, around the state of Himachal Pradesh. Being related to Hindi, they all use the Devanagari script for transcription.
We have used this dataset for experiments in ASR tasks. But these could be used for other applications in speech domain, like speaker recognition, language identification or even as unlabelled corpus for pre-training.
### Supported Tasks and Leaderboards
Automatic speech recognition, speaker recognition, language identification
### Languages
Hindi, Haryanvi, Bilaspuri, Dogri, Bhadrawahi, Gaddi, Kangri, Kulvi, Mandeali, Kulvi Outer Seraji, Pahari Mahasui
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
The Bible recordings were done in a studio setting by native speakers.
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The data is licensed under the Creative Commons Attribution-ShareAlike 4.0 International Public License (CC BY-SA 4.0)
### Citation Information
```
@inproceedings{Raju2022SnowMD,
  title={Snow Mountain: Dataset of Audio Recordings of The Bible in Low Resource Languages},
  author={Kavitha Raju and V. Anjaly and R. Allen Lish and Joel Mathew},
  year={2022}
}
```
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
Back-up/chung-khoan-demo-14 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 165251824
num_examples: 34667
download_size: 58929562
dataset_size: 165251824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aryamannningombam/indian-tts | ---
dataset_info:
features:
- name: text
dtype: string
- name: vec
sequence: float64
splits:
- name: train
num_bytes: 25418236030
num_examples: 34781
download_size: 19095913523
dataset_size: 25418236030
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|