| datasetId | card |
|---|---|
avsolatorio/mteb-amazon_reviews_multi-avs_triplets | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: label
dtype: int32
- name: label_text
dtype: string
- name: idx
dtype: int64
- name: query_idx
dtype: int64
- name: positive_idx
dtype: int64
- name: negative_idx
dtype: int64
splits:
- name: train
num_bytes: 52924802
num_examples: 200000
download_size: 33545313
dataset_size: 52924802
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MTEB Amazon Reviews Multi Dataset
This dataset was used in the paper GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning. Refer to https://arxiv.org/abs/2402.16829 for details.
The code for generating the data is available at https://github.com/avsolatorio/GISTEmbed/blob/main/scripts/create_classification_dataset.py.
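The triplet structure above links rows by index: each example's `query_idx`, `positive_idx`, and `negative_idx` point at other rows in the same split. A minimal sketch of resolving one training triplet, using hypothetical toy rows rather than the real data:

```python
# Toy rows mimicking the schema above: each example carries query_idx,
# positive_idx, and negative_idx that reference other rows by position.
rows = [
    {"idx": 0, "text": "great phone", "label": 4, "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 1, "text": "excellent device", "label": 4, "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 2, "text": "terrible battery", "label": 0, "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
]

def triplet(rows, i):
    """Resolve one (anchor, positive, negative) triplet from row i's index fields."""
    r = rows[i]
    return (
        rows[r["query_idx"]]["text"],
        rows[r["positive_idx"]]["text"],
        rows[r["negative_idx"]]["text"],
    )

print(triplet(rows, 0))  # ('great phone', 'excellent device', 'terrible battery')
```

The same lookup applies unchanged to the real split loaded via `datasets.load_dataset`, since the index fields are positions within the `train` split.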
## Citation
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
url={https://arxiv.org/abs/2402.16829},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
Miaosen/CCCleanerDataset8G | ---
task_categories:
- text-classification
language:
- en
--- |
qazisaad/llama_2_optimized_product_titles-esci-test-sft | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4827482
num_examples: 11924
download_size: 2588134
dataset_size: 4827482
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-test-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tmon546596046/processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 67615200.0
num_examples: 18782
download_size: 16390157
dataset_size: 67615200.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tr4n5b0y4l34t0r10/gabi_cattuzo9B | ---
license: openrail
---
|
joey234/mmlu-sociology-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 66976
num_examples: 201
download_size: 43493
dataset_size: 66976
---
# Dataset Card for "mmlu-sociology-rule-neg"
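The `answer` feature above is stored as a `class_label` integer. A minimal sketch of decoding it back to a letter, assuming only the `'0': A` … `'3': D` mapping shown in the YAML:

```python
# Reproduce the class_label mapping from the YAML above: int codes -> letters.
NAMES = {0: "A", 1: "B", 2: "C", 3: "D"}

def decode_answer(code: int) -> str:
    """Map a stored integer label back to its answer letter."""
    return NAMES[code]

print(decode_answer(2))  # C
```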
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1 | ---
pretty_name: Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T02:49:27.291692](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1/blob/main/results_2024-01-15T02-49-27.291692.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032096253875614,\n\
\ \"acc_stderr\": 0.03321637816759657,\n \"acc_norm\": 0.6097201219482176,\n\
\ \"acc_norm_stderr\": 0.033909808173675136,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.40550458795616723,\n\
\ \"mc2_stderr\": 0.015282277248005289\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212865,\n\
\ \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.01430175222327954\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6253734315873332,\n\
\ \"acc_stderr\": 0.00483037131784105,\n \"acc_norm\": 0.8228440549691296,\n\
\ \"acc_norm_stderr\": 0.003810203308901103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964684,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964684\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.01471168438613996,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388676996,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388676996\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.016214148752136632,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.016214148752136632\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533214,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533214\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.40550458795616723,\n\
\ \"mc2_stderr\": 0.015282277248005289\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025405\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2896133434420015,\n \
\ \"acc_stderr\": 0.012493927348659629\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|arc:challenge|25_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|gsm8k|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hellaswag|10_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T02-49-27.291692.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- '**/details_harness|winogrande|5_2024-01-15T02-49-27.291692.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T02-49-27.291692.parquet'
- config_name: results
data_files:
- split: 2024_01_15T02_49_27.291692
path:
- results_2024-01-15T02-49-27.291692.parquet
- split: latest
path:
- results_2024-01-15T02-49-27.291692.parquet
---
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T02:49:27.291692](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1/blob/main/results_2024-01-15T02-49-27.291692.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6032096253875614,
"acc_stderr": 0.03321637816759657,
"acc_norm": 0.6097201219482176,
"acc_norm_stderr": 0.033909808173675136,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.40550458795616723,
"mc2_stderr": 0.015282277248005289
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212865,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.01430175222327954
},
"harness|hellaswag|10": {
"acc": 0.6253734315873332,
"acc_stderr": 0.00483037131784105,
"acc_norm": 0.8228440549691296,
"acc_norm_stderr": 0.003810203308901103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964684,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964684
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388676996,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388676996
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.016214148752136632,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.016214148752136632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.02724561304721536,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.02724561304721536
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533214,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.40550458795616723,
"mc2_stderr": 0.015282277248005289
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025405
},
"harness|gsm8k|5": {
"acc": 0.2896133434420015,
"acc_stderr": 0.012493927348659629
}
}
```
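Per-task metrics like these can also be aggregated programmatically. The snippet below is a minimal sketch that macro-averages the `acc` metric over a small, hand-copied subset of the values shown above (it does not fetch or parse the full results file):

```python
# Macro-average "acc" over a few of the per-task results above.
# The dict below is an illustrative subset copied from the JSON,
# not loaded from the dataset itself.
results = {
    "harness|arc:challenge|25": {"acc": 0.5614334470989761},
    "harness|hellaswag|10": {"acc": 0.6253734315873332},
    "harness|winogrande|5": {"acc": 0.771112865035517},
    "harness|gsm8k|5": {"acc": 0.2896133434420015},
}

# Unweighted mean across tasks (a macro average).
macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"macro-average acc over {len(results)} tasks: {macro_acc:.4f}")
```

The same pattern extends to the full results file once it is loaded as a Python dict (e.g. via `json.load`).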
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_243 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1032589192
num_examples: 201206
download_size: 1054505769
dataset_size: 1032589192
---
# Dataset Card for "chunk_243"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ardneebwar/medmcqa-and-race | ---
license: apache-2.0
---
### Dataset: MEDMCQA-and-RACE
This dataset is a refined combination of two well-known datasets, MedMCQA and RACE, tailored for developing and evaluating machine learning models for multiple-choice question (MCQ) generation. Originally encompassing over 200,000 entries, the dataset has been cleaned and condensed to around 6,000 high-quality rows.
Each entry in the dataset comprises a context (textual content) followed by a corresponding question, a set of multiple-choice answers, and the correct answer. This structure makes the dataset an ideal resource for training and testing NLP models aimed at generating MCQs along with their answers based on provided contexts.
### Content
- context: This column contains the textual content or paragraph based on which questions and multiple-choice answers are formulated.
- question: This column contains the questions created from the context.
- options: This column contains the set of four multiple-choice answers (A, B, C, D) related to the question.
- correct_answer: This column indicates the correct answer from the options provided.
### Use Cases
This dataset is suitable for a range of applications, including but not limited to:
- Training models to generate MCQs from textual content.
- Evaluating comprehension and information retrieval capabilities of NLP models.
- Assisting educators and content creators in generating educational materials and assessment questions. |
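As an illustration of the row structure described above, the sketch below renders one row into a plain-text MCQ prompt plus its answer key. The field names follow the columns listed in the Content section; the sample values, the assumption that `options` is a list of four strings, and the formatting itself are illustrative, not part of the dataset specification.

```python
def format_mcq(row):
    """Render one dataset row (context, question, options, correct_answer)
    into a plain-text multiple-choice prompt plus its answer key."""
    # Assumes `options` is an ordered list of four answer strings (A-D).
    option_lines = "\n".join(
        f"{letter}. {text}" for letter, text in zip("ABCD", row["options"])
    )
    prompt = (
        f"Context: {row['context']}\n\n"
        f"Question: {row['question']}\n"
        f"{option_lines}"
    )
    return prompt, row["correct_answer"]


# Hypothetical sample row mirroring the column layout described above.
sample = {
    "context": "The mitral valve separates the left atrium from the left ventricle.",
    "question": "Which chambers does the mitral valve separate?",
    "options": [
        "Right atrium and right ventricle",
        "Left atrium and left ventricle",
        "Left atrium and right ventricle",
        "Right atrium and left ventricle",
    ],
    "correct_answer": "B",
}

prompt, answer = format_mcq(sample)
print(prompt)
print("Answer:", answer)
```

The same function can be mapped over rows loaded with `datasets.load_dataset` to build training prompts for an MCQ-generation model.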
AdiOO7/Ticket_Classification | ---
task_categories:
- text-classification
language:
- en
tags:
- code
pretty_name: Ticket-Categorization
size_categories:
- n<1K
--- |
NoaCA14/my_data | ---
dataset_info:
features:
- name: text
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 77357.07
num_examples: 63
- name: test
num_bytes: 36836.7
num_examples: 30
- name: validation
num_bytes: 8595.23
num_examples: 7
download_size: 110591
dataset_size: 122789.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-raw-80k | ---
pretty_name: Evaluation run of JunchengXie/Mistral-7B-v0.1-raw-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Mistral-7B-v0.1-raw-80k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-raw-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-raw-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:53:50.873338](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-raw-80k/blob/main/results_2024-03-27T18-53-50.873338.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6335396313921574,\n\
\ \"acc_stderr\": 0.032428002273992416,\n \"acc_norm\": 0.639690830589762,\n\
\ \"acc_norm_stderr\": 0.033084835477046035,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43020404044306754,\n\
\ \"mc2_stderr\": 0.014222202423343218\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n\
\ \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6320454092810197,\n\
\ \"acc_stderr\": 0.004812633280078267,\n \"acc_norm\": 0.8356901015733917,\n\
\ \"acc_norm_stderr\": 0.0036979923561244795\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"\
acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203648,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597514,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597514\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876168,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876168\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34301675977653634,\n\
\ \"acc_stderr\": 0.01587691267305774,\n \"acc_norm\": 0.34301675977653634,\n\
\ \"acc_norm_stderr\": 0.01587691267305774\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616305,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616305\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43020404044306754,\n\
\ \"mc2_stderr\": 0.014222202423343218\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345398\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \
\ \"acc_stderr\": 0.013228626753925148\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Mistral-7B-v0.1-raw-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-53-50.873338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-53-50.873338.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- '**/details_harness|winogrande|5_2024-03-27T18-53-50.873338.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-53-50.873338.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_53_50.873338
path:
- results_2024-03-27T18-53-50.873338.parquet
- split: latest
path:
- results_2024-03-27T18-53-50.873338.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Mistral-7B-v0.1-raw-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Mistral-7B-v0.1-raw-80k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-raw-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-raw-80k",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-27T18:53:50.873338](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-raw-80k/blob/main/results_2024-03-27T18-53-50.873338.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6335396313921574,
"acc_stderr": 0.032428002273992416,
"acc_norm": 0.639690830589762,
"acc_norm_stderr": 0.033084835477046035,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.43020404044306754,
"mc2_stderr": 0.014222202423343218
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251104
},
"harness|hellaswag|10": {
"acc": 0.6320454092810197,
"acc_stderr": 0.004812633280078267,
"acc_norm": 0.8356901015733917,
"acc_norm_stderr": 0.0036979923561244795
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203648,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597514,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597514
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876168,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34301675977653634,
"acc_stderr": 0.01587691267305774,
"acc_norm": 0.34301675977653634,
"acc_norm_stderr": 0.01587691267305774
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616305,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616305
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.43020404044306754,
"mc2_stderr": 0.014222202423343218
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345398
},
"harness|gsm8k|5": {
"acc": 0.36087945413191813,
"acc_stderr": 0.013228626753925148
}
}
```
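The top-level `"all"` accuracy above is a macro-average over the individual harness tasks. A minimal sketch of that computation, using a small hypothetical subset of the per-task entries rather than the full result set:

```python
# Hypothetical subset of the per-task results shown above; in practice this
# dict would come from loading the full results JSON for the run.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6381578947368421},
}

# Select the MMLU (hendrycksTest) tasks and macro-average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(mmlu_acc, 4))  # → 0.5175
```

Applied to all tasks in the results file, the same averaging yields the aggregate metrics reported on the leaderboard.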
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
crodri/autotrain-data-massive-4-catalan | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: massive-4-catalan
## Dataset Description
This dataset has been automatically processed by AutoTrain for project massive-4-catalan.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_id": "1",
"feat_locale": "ca-ES",
"feat_partition": "train",
"feat_scenario": 0,
"target": 2,
"text": "desperta'm a les nou a. m. del divendres",
"feat_annot_utt": "desperta'm a les [time : nou a. m.] del [date : divendres]",
"feat_worker_id": "42",
"feat_slot_method.slot": [
"time",
"date"
],
"feat_slot_method.method": [
"translation",
"translation"
],
"feat_judgments.worker_id": [
"42",
"30",
"3"
],
"feat_judgments.intent_score": [
1,
1,
1
],
"feat_judgments.slots_score": [
1,
1,
1
],
"feat_judgments.grammar_score": [
4,
3,
4
],
"feat_judgments.spelling_score": [
2,
2,
2
],
"feat_judgments.language_identification": [
"target",
"target|english",
"target"
]
},
{
"feat_id": "2",
"feat_locale": "ca-ES",
"feat_partition": "train",
"feat_scenario": 0,
"target": 2,
"text": "posa una alarma per d\u2019aqu\u00ed a dues hores",
"feat_annot_utt": "posa una alarma per [time : d\u2019aqu\u00ed a dues hores]",
"feat_worker_id": "15",
"feat_slot_method.slot": [
"time"
],
"feat_slot_method.method": [
"translation"
],
"feat_judgments.worker_id": [
"42",
"30",
"24"
],
"feat_judgments.intent_score": [
1,
1,
1
],
"feat_judgments.slots_score": [
1,
1,
1
],
"feat_judgments.grammar_score": [
4,
4,
4
],
"feat_judgments.spelling_score": [
2,
2,
2
],
"feat_judgments.language_identification": [
"target",
"target",
"target"
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_id": "Value(dtype='string', id=None)",
"feat_locale": "Value(dtype='string', id=None)",
"feat_partition": "Value(dtype='string', id=None)",
"feat_scenario": "ClassLabel(num_classes=18, names=['alarm', 'audio', 'calendar', 'cooking', 'datetime', 'email', 'general', 'iot', 'lists', 'music', 'news', 'play', 'qa', 'recommendation', 'social', 'takeaway', 'transport', 'weather'], id=None)",
"target": "ClassLabel(num_classes=60, names=['alarm_query', 'alarm_remove', 'alarm_set', 'audio_volume_down', 'audio_volume_mute', 'audio_volume_other', 'audio_volume_up', 'calendar_query', 'calendar_remove', 'calendar_set', 'cooking_query', 'cooking_recipe', 'datetime_convert', 'datetime_query', 'email_addcontact', 'email_query', 'email_querycontact', 'email_sendemail', 'general_greet', 'general_joke', 'general_quirky', 'iot_cleaning', 'iot_coffee', 'iot_hue_lightchange', 'iot_hue_lightdim', 'iot_hue_lightoff', 'iot_hue_lighton', 'iot_hue_lightup', 'iot_wemo_off', 'iot_wemo_on', 'lists_createoradd', 'lists_query', 'lists_remove', 'music_dislikeness', 'music_likeness', 'music_query', 'music_settings', 'news_query', 'play_audiobook', 'play_game', 'play_music', 'play_podcasts', 'play_radio', 'qa_currency', 'qa_definition', 'qa_factoid', 'qa_maths', 'qa_stock', 'recommendation_events', 'recommendation_locations', 'recommendation_movies', 'social_post', 'social_query', 'takeaway_order', 'takeaway_query', 'transport_query', 'transport_taxi', 'transport_ticket', 'transport_traffic', 'weather_query'], id=None)",
"text": "Value(dtype='string', id=None)",
"feat_annot_utt": "Value(dtype='string', id=None)",
"feat_worker_id": "Value(dtype='string', id=None)",
"feat_slot_method.slot": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_slot_method.method": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_judgments.worker_id": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_judgments.intent_score": "Sequence(feature=Value(dtype='int8', id=None), length=-1, id=None)",
"feat_judgments.slots_score": "Sequence(feature=Value(dtype='int8', id=None), length=-1, id=None)",
"feat_judgments.grammar_score": "Sequence(feature=Value(dtype='int8', id=None), length=-1, id=None)",
"feat_judgments.spelling_score": "Sequence(feature=Value(dtype='int8', id=None), length=-1, id=None)",
"feat_judgments.language_identification": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
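Since `feat_scenario` and `target` are stored as integers, a minimal sketch (reusing the ClassLabel name lists from the schema above) recovers the human-readable labels. For the first sample shown earlier (`feat_scenario: 0`, `target: 2`) this yields the `alarm` scenario and the `alarm_set` intent:

```python
# ClassLabel name lists copied verbatim from the schema above.
SCENARIOS = [
    'alarm', 'audio', 'calendar', 'cooking', 'datetime', 'email', 'general',
    'iot', 'lists', 'music', 'news', 'play', 'qa', 'recommendation',
    'social', 'takeaway', 'transport', 'weather',
]
INTENTS = [
    'alarm_query', 'alarm_remove', 'alarm_set', 'audio_volume_down',
    'audio_volume_mute', 'audio_volume_other', 'audio_volume_up',
    'calendar_query', 'calendar_remove', 'calendar_set', 'cooking_query',
    'cooking_recipe', 'datetime_convert', 'datetime_query', 'email_addcontact',
    'email_query', 'email_querycontact', 'email_sendemail', 'general_greet',
    'general_joke', 'general_quirky', 'iot_cleaning', 'iot_coffee',
    'iot_hue_lightchange', 'iot_hue_lightdim', 'iot_hue_lightoff',
    'iot_hue_lighton', 'iot_hue_lightup', 'iot_wemo_off', 'iot_wemo_on',
    'lists_createoradd', 'lists_query', 'lists_remove', 'music_dislikeness',
    'music_likeness', 'music_query', 'music_settings', 'news_query',
    'play_audiobook', 'play_game', 'play_music', 'play_podcasts', 'play_radio',
    'qa_currency', 'qa_definition', 'qa_factoid', 'qa_maths', 'qa_stock',
    'recommendation_events', 'recommendation_locations',
    'recommendation_movies', 'social_post', 'social_query', 'takeaway_order',
    'takeaway_query', 'transport_query', 'transport_taxi', 'transport_ticket',
    'transport_traffic', 'weather_query',
]

def decode(sample):
    """Map the integer scenario/intent fields back to their string names."""
    return SCENARIOS[sample['feat_scenario']], INTENTS[sample['target']]

scenario, intent = decode({'feat_scenario': 0, 'target': 2})
# scenario == 'alarm', intent == 'alarm_set'
```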
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 11514 |
| valid | 2033 |
|
DBQ/Chloe.Product.prices.Hong.Kong | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Hong Kong - Chloe - Product-level price list
tags:
- webscraping
- ecommerce
- Chloe
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 675508
num_examples: 2431
download_size: 160521
dataset_size: 675508
---
# Chloe web scraped data
## About the website
The **Ecommerce industry** in the **Asia Pacific**, specifically in **Hong Kong**, is highly dynamic and competitive. Advances in digital technology have greatly transformed the way business is done, with particular momentum in the **online fashion industry**. Predominant players like **Chloe** are capitalizing on the region's growing internet population, rising disposable incomes, and a cultural shift towards online shopping. The dataset observed includes **Ecommerce product-list page (PLP) data** for Chloe in Hong Kong. This data provides invaluable insights into consumer behavior, market trends, and the competitive landscape, enabling businesses to make informed and strategic decisions in the rapidly evolving **digital commerce** landscape.
## Link to **dataset**
[Hong Kong - Chloe - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Chloe%20Product-prices%20Hong%20Kong/r/recWuG5WfvLdNOGCo)
|
duckaiml/Polylingual_Id | ---
license: other
---
# Polylingual Indonesia Dataset/Model Card
## Description
Polylingual Indonesia is a diverse dataset composed of a collection of publicly available data and some self-crawled data in the Indonesian language. The data originates from various sources including news articles, books, forums, and stories. This dataset is processed and filtered through specific techniques detailed in the sections below.
## Prerequisites
To run and utilize this dataset, make sure you have the `zstandard` package installed in your environment.
## Format
The original format of the dataset is JSONL compressed with zstandard.
## Dataset Details
### Filtering
The dataset undergoes a filtration process using a specific filter from BigScience. Detailed information about the filter used can be found here: [BigScience Filter](https://drive.google.com/file/d/1cCJ8sWE88TRLDAa3eHLmXO4JlkR2QzLY/view?usp=sharing).
### Data Sources
The data originates from various sources, and each folder in the dataset represents the source from where the original data came. Here are the details of each folder/source:
#### 1. HPLT_filtered
- Link: [HPLT Project](https://hplt-project.org/datasets/v1)
- Source: Internet Archive snapshots WIDE15 and WIDE17, and CC-MAIN-2022-40
#### 2. Mc4-und-id
- Data is filtered from the undefined-language segment of mC4 (c4-und) using FastText and the BigScience filters.
- Sample: [MC4 Sample](https://huggingface.co/datasets/allenai/c4/blob/mC4310/multilingual/c4-und.00000-00001-00002-00003-00004-00005-00006-00007.json.gz)
#### 3. Indonesia-Crawl
This folder contains a collection of Common Crawl data and self-crawled data specific to the Indonesian language, accumulated from various snapshots. The data is divided into several sections:
- **mC4 original (deduplicated)**: snapshots CC-2013-20 through CC-2020-34.
- **KoPI-CC (deduplicated)**: snapshots CC-2020-34 through CC-2023-06. More details can be found [here](https://huggingface.co/datasets/acul3/KoPI-CC) (note: the last snapshot is still to be uploaded).
- **KoPI-CC_News**: the Common Crawl News dataset from 2016 to 2022. Detailed information can be accessed [here](https://huggingface.co/datasets/acul3/KoPI-CC) (note: the last snapshot is still to be uploaded).
- **Self-crawled data**: data crawled from various platforms, including news sites, story sites, forums, and others.
## Usage
Install the `zstandard` package first (for example, `pip install zstandard`), then:
```python
from datasets import load_dataset
hplt = load_dataset('duckaiml/Polylingual_Id','hplt') #hplt only
mc4_und = load_dataset('duckaiml/Polylingual_Id','mc4_und') #mc4_und only
indonesia_crawl = load_dataset('duckaiml/Polylingual_Id','indonesia_crawl') #indonesia_crawl only
load_dataset('duckaiml/Polylingual_Id','full') #load all
```
## Limitation/Issue
- Although some of the source datasets were already deduplicated individually, duplicates may still occur because the combined dataset has not been deduplicated as a whole.
## Contributing
Feel free to contribute to the dataset by adding more diverse sources or helping in enhancing the filtration process.
## License
The data is collected from public sources, and it's recommended to refer to the original data sources for information on licensing and usage restrictions.
## Contact
For any queries or contributions to the dataset, please feel free to reach out (contact information to be added).
|
thevox/en-nb-10k | ---
license: mpl-2.0
task_categories:
- translation
language:
- en
- nb
- 'no'
pretty_name: English-Norwegian Translation
size_categories:
- 10K<n<100K
---
## Update
See the larger dataset: [en-nb-15k](https://huggingface.co/datasets/thevox/en-nb-15k)
## Methodology
GPT-3.5 was used with a translation prompt to produce two versions of the Norwegian translation of each English text: a normal one and a "more natural" one. A context is also generated for each sample.
## Data
Input sample format:
```
Oversett til Norsk:
{text}
```
Each output sample is formatted like this:
```
Kontekst: {a, b, c}
Oversettelse:
{translation}
Mer naturlig:
{improved_translation}
```
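For downstream processing, a minimal sketch of splitting a generated output sample back into its three parts (it assumes the exact section labels shown above):

```python
def parse_sample(sample: str) -> dict:
    """Split an output sample into context, translation, and improved translation."""
    context, rest = sample.split("Oversettelse:", 1)
    translation, improved = rest.split("Mer naturlig:", 1)
    return {
        "context": context.replace("Kontekst:", "", 1).strip(),
        "translation": translation.strip(),
        "improved_translation": improved.strip(),
    }


# Hypothetical sample in the documented layout:
example = """Kontekst: {a, b, c}
Oversettelse:
Hei, verden.
Mer naturlig:
Heisann, verden!"""
parsed = parse_sample(example)
# parsed["translation"] == "Hei, verden."
```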
## Future work
The dataset will be used to train LLM-based translation models built on Llama 2 and similar models, aiming to rival DeepL and ChatGPT machine translation.
## Dataset used
Original english text from: [nampdn-ai/tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks)
### Author Contact
jonaslsa@uio.no |
onkar627/MentaBot_Project | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_wnli_bare_perfect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3099
num_examples: 15
- name: test
num_bytes: 17321
num_examples: 58
- name: train
num_bytes: 28088
num_examples: 137
download_size: 24245
dataset_size: 48508
---
# Dataset Card for "MULTI_VALUE_wnli_bare_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_1_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 726346
num_examples: 3480
download_size: 273296
dataset_size: 726346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_1_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seongill/nq_qa_set | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float32
splits:
- name: train
num_bytes: 854033679
num_examples: 87925
download_size: 662313863
dataset_size: 854033679
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
matthewlqin/electronic_music | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 90443385.0
num_examples: 370
download_size: 90380943
dataset_size: 90443385.0
---
# Dataset Card for "electronic_music"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_mass_noun_plurals | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 14684
num_examples: 181
- name: test
num_bytes: 15633
num_examples: 198
- name: train
num_bytes: 116248
num_examples: 1515
download_size: 74619
dataset_size: 146565
---
# Dataset Card for "MULTI_VALUE_cola_mass_noun_plurals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ali-C137/TARJAMAT-UNPC-EN-ZH | ---
dataset_info:
- config_name: un-pc_ar-en
features:
- name: arabic
dtype: string
- name: english
dtype: string
- name: source
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 8500696893
num_examples: 20044478
download_size: 3648057038
dataset_size: 8500696893
- config_name: un-pc_ar-zh
features:
- name: arabic
dtype: string
- name: chinese
dtype: string
- name: source
dtype: string
- name: metadata
dtype: 'null'
splits:
- name: train
num_bytes: 6707235000
num_examples: 17306056
download_size: 3058205349
dataset_size: 6707235000
configs:
- config_name: un-pc_ar-en
data_files:
- split: train
path: un-pc_ar-en/train-*
- config_name: un-pc_ar-zh
data_files:
- split: train
path: un-pc_ar-zh/train-*
---
|
GrainsPolito/BBBicycles | ---
license: cc-by-nc-4.0
---
# Dataset Card for BBBicycles
## Dataset Summary
The Bent & Broken Bicycles (BBBicycles) dataset is a benchmark for the novel task of **damaged object re-identification**, which aims to identify the same object across multiple images even in the presence of breaks, deformations, and missing parts. You can find an interactive preview [here](https://huggingface.co/spaces/GrainsPolito/BBBicyclesPreview).
## Dataset Structure
The final dataset contains:
- A total of 39,200 images
- 2,800 unique IDs
- 20 models
- 140 IDs for each model
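These figures are internally consistent, as a quick check shows (the 14 renders per ID is derived here, not stated explicitly in the original card):

```python
models = 20
ids_per_model = 140
total_images = 39_200

total_ids = models * ids_per_model           # 2,800 unique IDs
renders_per_id = total_images // total_ids   # 14 renders for each ID
```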
<table border-collapse="collapse">
<tr>
<td><b style="font-size:25px">Information for each ID:</b></td>
<td><b style="font-size:25px">Information for each render:</b></td>
</tr>
<tr>
<td>
<ul>
<li>Model</li>
<li>Type</li>
<li>Texture type</li>
<li>Stickers</li>
</ul>
</td>
<td>
<ul>
<li>Background</li>
<li>Viewing Side</li>
<li>Focal Length</li>
<li>Presence of dirt</li>
</ul>
</td>
</tr>
</table>
### Citation Information
```
@inproceedings{bbb_2022,
  title={Bent \& Broken Bicycles: Leveraging synthetic data for damaged object re-identification},
  author={Piano, Luca and Pratticò, Filippo Gabriele and Russo, Alessandro Sebastian and Lanari, Lorenzo and Morra, Lia and Lamberti, Fabrizio},
booktitle={2022 IEEE Winter Conference on Applications of Computer Vision (WACV)},
year={2022},
organization={IEEE}
}
```
### Credits
The authors gratefully acknowledge the financial support of Reale Mutua Assicurazioni. |
CyberHarem/gudako_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Gudako/ぐだ子/リヨぐだ子/藤丸立香 (Fate/Grand Order)
This is the dataset of Gudako/ぐだ子/リヨぐだ子/藤丸立香 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `orange_hair, hair_between_eyes, ahoge, orange_eyes, hair_ornament, one_side_up, scrunchie, short_hair, hair_scrunchie, breasts, medium_breasts, yellow_eyes, side_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 656.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gudako_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 580.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gudako_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1209 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gudako_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
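The IMG+TXT packages follow the common convention of pairing each image with a same-named `.txt` file of comma-separated tags. A minimal sketch for collecting those pairs after extraction (the layout convention is an assumption here, since the card does not spell it out):

```python
from pathlib import Path

IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}


def collect_pairs(dataset_dir):
    """Yield (image_path, tag_list) for every image with a sibling .txt file."""
    for img in sorted(Path(dataset_dir).rglob('*')):
        if img.suffix.lower() not in IMAGE_EXTS:
            continue
        txt = img.with_suffix('.txt')
        if txt.exists():
            tags = [t.strip()
                    for t in txt.read_text(encoding='utf-8').split(',')
                    if t.strip()]
            yield img, tags
```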
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gudako_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, looking_at_viewer, navel, official_alternate_costume, striped_bikini, striped_clothes, smile, open_mouth, orange_bikini, side-tie_bikini_bottom, cleavage, collarbone, blush, o-ring, outdoors, bare_shoulders, blue_sky, halterneck, ocean, day |
| 1 | 8 |  |  |  |  |  | 1girl, black_gloves, black_jacket, solo, black_scrunchie, command_spell, looking_at_viewer, upper_body, closed_mouth, white_background, simple_background, smile |
| 2 | 8 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_scrunchie, command_spell, looking_at_viewer, miniskirt, pleated_skirt, solo, uniform, grey_skirt, standing, closed_mouth, cowboy_shot, floating_hair, black_skirt, grey_background, open_mouth, signature |
| 3 | 18 |  |  |  |  |  | 1girl, black_gloves, long_sleeves, solo, command_spell, official_alternate_costume, looking_at_viewer, closed_mouth, white_dress, jacket |
| 4 | 13 |  |  |  |  |  | 1girl, black_skirt, chaldea_uniform, long_sleeves, pleated_skirt, solo, black_pantyhose, belt, looking_at_viewer, smile, blush, orange_scrunchie, yellow_scrunchie, cowboy_shot, white_background, miniskirt, simple_background, open_mouth, white_jacket |
| 5 | 7 |  |  |  |  |  | 1girl, epaulettes, long_sleeves, looking_at_viewer, mini_crown, official_alternate_costume, solo, white_gloves, belt_buckle, blue_jacket, smile, white_bowtie, black_scrunchie, open_mouth, aiguillette, black_belt, black_pantyhose, cowboy_shot, standing, white_dress |
| 6 | 8 |  |  |  |  |  | 1girl, black_necktie, formal, black_jacket, long_sleeves, looking_at_viewer, solo, black_ribbon, hair_ribbon, ponytail, closed_mouth, grey_shirt, alternate_costume, collared_shirt, command_spell, simple_background, upper_body, white_background, black_pants, pant_suit |
| 7 | 6 |  |  |  |  |  | 1girl, black_pantyhose, looking_at_viewer, official_alternate_costume, solo, smile, bow, orange_scrunchie, black_footwear, black_shorts, full_body, long_sleeves, orange_shirt, pantyhose_under_shorts, short_shorts, simple_background, yellow_scrunchie |
| 8 | 18 |  |  |  |  |  | 1girl, red_neckerchief, serafuku, solo, long_sleeves, looking_at_viewer, sailor_collar, pleated_skirt, collarbone, smile, alternate_costume, blush, white_background, blue_skirt, orange_scrunchie, blue_shirt, socks, yellow_scrunchie |
| 9 | 12 |  |  |  |  |  | glasses, long_sleeves, 1girl, green_necktie, looking_at_viewer, school_uniform, solo, beret, black_scrunchie, smile, white_shirt, alternate_costume, miniskirt, collared_shirt, purple_headwear, purple_skirt, purple_vest, black_thighhighs, blush, capelet, cowboy_shot, pencil_skirt |
| 10 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, official_alternate_costume, solo, midriff, navel, white_shorts, chaldea_logo, crop_top, orange_scarf, shirt, simple_background, smile, bike_shorts_under_shorts, blush, bodysuit_under_clothes, long_sleeves, open_mouth, white_background, belt_buckle, brown_belt, command_spell |
| 11 | 15 |  |  |  |  |  | 1girl, alternate_costume, solo, long_sleeves, smile, looking_at_viewer, wide_sleeves, floral_print, print_kimono, obi, flower, holding, blue_kimono, hakama_skirt, open_mouth |
| 12 | 7 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, smile, fake_animal_ears, puffy_short_sleeves, rabbit_ears, alternate_costume, blush, black_gloves, black_shorts, cleavage, medium_hair, rabbit_tail, red_bowtie, ribbon, thighhighs, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | navel | official_alternate_costume | striped_bikini | striped_clothes | smile | open_mouth | orange_bikini | side-tie_bikini_bottom | cleavage | collarbone | blush | o-ring | outdoors | bare_shoulders | blue_sky | halterneck | ocean | day | black_gloves | black_jacket | black_scrunchie | command_spell | upper_body | closed_mouth | white_background | simple_background | miniskirt | pleated_skirt | uniform | grey_skirt | standing | cowboy_shot | floating_hair | black_skirt | grey_background | signature | long_sleeves | white_dress | jacket | chaldea_uniform | black_pantyhose | belt | orange_scrunchie | yellow_scrunchie | white_jacket | epaulettes | mini_crown | white_gloves | belt_buckle | blue_jacket | white_bowtie | aiguillette | black_belt | black_necktie | formal | black_ribbon | hair_ribbon | ponytail | grey_shirt | alternate_costume | collared_shirt | black_pants | pant_suit | bow | black_footwear | black_shorts | full_body | orange_shirt | pantyhose_under_shorts | short_shorts | red_neckerchief | serafuku | sailor_collar | blue_skirt | blue_shirt | socks | glasses | green_necktie | school_uniform | beret | white_shirt | purple_headwear | purple_skirt | purple_vest | black_thighhighs | capelet | pencil_skirt | midriff | white_shorts | chaldea_logo | crop_top | orange_scarf | shirt | bike_shorts_under_shorts | bodysuit_under_clothes | brown_belt | wide_sleeves | floral_print | print_kimono | obi | flower | holding | blue_kimono | hakama_skirt | fake_animal_ears | puffy_short_sleeves | rabbit_ears | medium_hair | rabbit_tail | red_bowtie | ribbon | thighhighs |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | | | | | X | | | | | | | | | | | | | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | X | | | | | | | | | | | | | | X | X | X | X | | | | X | | X | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | X | X | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 18 |  |  |  |  |  | X | X | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | X | X | | | | | X | | | | | | X | | | | | | | | | | X | | | | | | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 13 |  |  |  |  |  | X | X | X | X | X | | | X | X | | | | | X | | | | | | | | | | | X | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 11 | 15 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | X | X | | | | | X | X | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
mukayese/arc-tr | ---
language:
- tr
license: apache-2.0
---
|
Tristan/t5-small-october-wikipedia-2022-tokenized-512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 30029601900
num_examples: 9737225
download_size: 9411819822
dataset_size: 30029601900
---
# Dataset Card for "t5-small-october-wikipedia-2022-tokenized-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jasshl/custom_ADE | ---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype: image
  splits:
  - name: train
    num_bytes: 16486884.0
    num_examples: 336
  - name: validation
    num_bytes: 69672194.779
    num_examples: 1347
  download_size: 83541043
  dataset_size: 86159078.779
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
---
|
hippocrates/PatientQA_train | ---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: conversations
    list:
    - name: from
      dtype: string
    - name: value
      dtype: string
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 4198675
    num_examples: 5942
  download_size: 1939016
  dataset_size: 4198675
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
CyberHarem/mem_oshinoko | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of MEM
This is the dataset of MEM, containing 200 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 416 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 416 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 416 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 416 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
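The download links in the table above are relative to this repository. A minimal sketch of resolving them into absolute URLs; the URL scheme is assumed to follow the fully-qualified `resolve/main` pattern used by the other CyberHarem dataset cards:

```python
# Sketch: resolve the relative download links above into absolute URLs.
# Assumption: the scheme matches the fully-qualified links seen in other
# CyberHarem cards (https://huggingface.co/datasets/<repo>/resolve/main/<file>).
REPO_ID = "CyberHarem/mem_oshinoko"
PACKAGES = ["dataset-raw.zip", "dataset-384x512.zip", "dataset-640x640.zip"]

def package_url(repo_id: str, filename: str) -> str:
    """Build the direct download URL for one package archive."""
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"

for name in PACKAGES:
    print(package_url(REPO_ID, name))
```

The resulting URLs can then be fetched with any HTTP client, or the archives can be downloaded via `huggingface_hub.hf_hub_download` as shown in the other cards.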
|
Tigranchick/falcon_sql_v2 | ---
language:
- en
dataset_info:
  features:
  - name: input
    dtype: string
  - name: context
    struct:
    - name: current_timestamp
      dtype: string
    - name: schemas
      dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 752589
    num_examples: 577
  download_size: 63090
  dataset_size: 752589
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
deepapaikar/Updated_SC_YU | ---
license: apache-2.0
---
|
Skarut1945/Gabriel | ---
license: openrail
---
|
Miniex/katieVoz3 | ---
license: openrail
---
|
CyberHarem/amelia_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of amelia (Fire Emblem)
This is the dataset of amelia (Fire Emblem), containing 63 images and their tags.
The core tags of this character are `blonde_hair, short_hair, green_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 63 | 63.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amelia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 63 | 37.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amelia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 115 | 69.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amelia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 63 | 57.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amelia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 115 | 94.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amelia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/amelia_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, skirt, open_mouth, zettai_ryouiki, shoulder_armor, gauntlets, full_body, white_background, bangs, smile, chain, holding_weapon, red_cape, battle_axe, black_thighhighs, elbow_gloves, huge_weapon, looking_at_viewer, simple_background |
| 1 | 21 |  |  |  |  |  | 1girl, hetero, solo_focus, penis, 1boy, nipples, sex, blush, open_mouth, vaginal, thighhighs, cum_in_pussy, nude, large_breasts, medium_breasts, mosaic_censoring, sweat, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | skirt | open_mouth | zettai_ryouiki | shoulder_armor | gauntlets | full_body | white_background | bangs | smile | chain | holding_weapon | red_cape | battle_axe | black_thighhighs | elbow_gloves | huge_weapon | looking_at_viewer | simple_background | hetero | solo_focus | penis | 1boy | nipples | sex | blush | vaginal | thighhighs | cum_in_pussy | nude | large_breasts | medium_breasts | mosaic_censoring | sweat | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:-----------------|:-----------------|:------------|:------------|:-------------------|:--------|:--------|:--------|:-----------------|:-----------|:-------------|:-------------------|:---------------|:--------------|:--------------------|:--------------------|:---------|:-------------|:--------|:-------|:----------|:------|:--------|:----------|:-------------|:---------------|:-------|:----------------|:-----------------|:-------------------|:--------|:--------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/manuela_casagranda_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of manuela_casagranda (Fire Emblem)
This is the dataset of manuela_casagranda (Fire Emblem), containing 324 images and their tags.
The core tags of this character are `short_hair, breasts, brown_hair, mole, brown_eyes, mole_under_eye, large_breasts, eyeshadow, mature_female`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 324 | 403.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manuela_casagranda_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 324 | 217.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manuela_casagranda_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 786 | 458.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manuela_casagranda_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 324 | 349.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manuela_casagranda_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 786 | 660.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manuela_casagranda_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/manuela_casagranda_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, choker, lipstick, cleavage, hair_slicked_back, looking_at_viewer, smile, dress, simple_background |
| 1 | 11 |  |  |  |  |  | 1girl, bell, christmas, cleavage, fake_animal_ears, fake_antlers, hair_slicked_back, lipstick, official_alternate_costume, reindeer_antlers, solo, fishnet_pantyhose, looking_at_viewer, smile, gloves, blush, cape, one_eye_closed, choker, reindeer_costume |
| 2 | 9 |  |  |  |  |  | 1girl, nipples, solo, navel, looking_at_viewer, smile, barefoot, completely_nude, lipstick, simple_background, sitting, white_background |
| 3 | 14 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, lipstick, nipples, uncensored, blush, nude, paizuri, hair_slicked_back, choker, huge_breasts, large_penis, smile, cum_on_breasts, facial, heart |
| 4 | 6 |  |  |  |  |  | 1girl, nipples, pussy, solo, spread_legs, anus, nude, smile, uncensored, blush, looking_at_viewer, navel, thighhighs, hair_slicked_back, lipstick |
| 5 | 10 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, penis, vaginal, completely_nude, lipstick, solo_focus, blush, uncensored, pussy, girl_on_top, hair_slicked_back, navel, open_mouth, sex_from_behind, smile, straddling, ass, choker, male_pubic_hair, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | choker | lipstick | cleavage | hair_slicked_back | looking_at_viewer | smile | dress | simple_background | bell | christmas | fake_animal_ears | fake_antlers | official_alternate_costume | reindeer_antlers | fishnet_pantyhose | gloves | blush | cape | one_eye_closed | reindeer_costume | nipples | navel | barefoot | completely_nude | sitting | white_background | 1boy | hetero | solo_focus | uncensored | nude | paizuri | huge_breasts | large_penis | cum_on_breasts | facial | heart | pussy | spread_legs | anus | thighhighs | penis | vaginal | girl_on_top | open_mouth | sex_from_behind | straddling | ass | male_pubic_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:-----------|:-----------|:--------------------|:--------------------|:--------|:--------|:--------------------|:-------|:------------|:-------------------|:---------------|:-----------------------------|:-------------------|:--------------------|:---------|:--------|:-------|:-----------------|:-------------------|:----------|:--------|:-----------|:------------------|:----------|:-------------------|:-------|:---------|:-------------|:-------------|:-------|:----------|:---------------|:--------------|:-----------------|:---------|:--------|:--------|:--------------|:-------|:-------------|:--------|:----------|:--------------|:-------------|:------------------|:-------------|:------|:------------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | | | X | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | | X | X | X | | | | | | | | | | | X | | | | X | X | | | | | | | | X | X | | | | | | | X | X | X | X | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | | X | | | | X | X | | X | | | X | X | X | X | | | | | | | | X | X | | | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO | ---
pretty_name: Evaluation run of FelixChao/Capricorn-7B-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/Capricorn-7B-DPO](https://huggingface.co/FelixChao/Capricorn-7B-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T06:20:03.216862](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO/blob/main/results_2024-02-15T06-20-03.216862.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6488413542940337,\n\
\ \"acc_stderr\": 0.03217471597461982,\n \"acc_norm\": 0.64849979912089,\n\
\ \"acc_norm_stderr\": 0.03284318919319882,\n \"mc1\": 0.6119951040391677,\n\
\ \"mc1_stderr\": 0.01705876150134798,\n \"mc2\": 0.7723177165257333,\n\
\ \"mc2_stderr\": 0.013804607975615193\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6967735510854411,\n\
\ \"acc_stderr\": 0.004587128273935072,\n \"acc_norm\": 0.8846843258315077,\n\
\ \"acc_norm_stderr\": 0.0031874975090874164\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6119951040391677,\n\
\ \"mc1_stderr\": 0.01705876150134798,\n \"mc2\": 0.7723177165257333,\n\
\ \"mc2_stderr\": 0.013804607975615193\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838899\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624174\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/Capricorn-7B-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2024_02_15T06_20_03.216862
    path:
    - '**/details_harness|arc:challenge|25_2024-02-15T06-20-03.216862.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2024_02_15T06_20_03.216862
    path:
    - '**/details_harness|gsm8k|5_2024-02-15T06-20-03.216862.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2024_02_15T06_20_03.216862
    path:
    - '**/details_harness|hellaswag|10_2024-02-15T06-20-03.216862.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2024_02_15T06_20_03.216862
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T06-20-03.216862.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- '**/details_harness|winogrande|5_2024-02-15T06-20-03.216862.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T06-20-03.216862.parquet'
- config_name: results
data_files:
- split: 2024_02_15T06_20_03.216862
path:
- results_2024-02-15T06-20-03.216862.parquet
- split: latest
path:
- results_2024-02-15T06-20-03.216862.parquet
---
# Dataset Card for Evaluation run of FelixChao/Capricorn-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Capricorn-7B-DPO](https://huggingface.co/FelixChao/Capricorn-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
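As a small illustration of the split-naming scheme, the run timestamp is turned into a split name by replacing dashes and colons with underscores (so `2024-02-15T06:20:03.216862` becomes the split `2024_02_15T06_20_03.216862` seen in the file paths above). The helper below is purely illustrative and not part of any library:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset.

    Dashes and colons are replaced by underscores; the fractional-second
    part is kept as-is. `timestamp_to_split` is a hypothetical helper,
    shown only to make the naming convention explicit.
    """
    return ts.replace("-", "_").replace(":", "_")


print(timestamp_to_split("2024-02-15T06:20:03.216862"))
# → 2024_02_15T06_20_03.216862
```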
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T06:20:03.216862](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO/blob/main/results_2024-02-15T06-20-03.216862.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6488413542940337,
"acc_stderr": 0.03217471597461982,
"acc_norm": 0.64849979912089,
"acc_norm_stderr": 0.03284318919319882,
"mc1": 0.6119951040391677,
"mc1_stderr": 0.01705876150134798,
"mc2": 0.7723177165257333,
"mc2_stderr": 0.013804607975615193
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.6967735510854411,
"acc_stderr": 0.004587128273935072,
"acc_norm": 0.8846843258315077,
"acc_norm_stderr": 0.0031874975090874164
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278884,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6119951040391677,
"mc1_stderr": 0.01705876150134798,
"mc2": 0.7723177165257333,
"mc2_stderr": 0.013804607975615193
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838899
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624174
}
}
```
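Once downloaded, the results JSON above can be sliced programmatically. The snippet below uses a small hypothetical dictionary mirroring the structure shown (only a few keys reproduced) to pull out the normalized accuracy for each MMLU (`hendrycksTest`) subtask:

```python
# Hypothetical, truncated copy of the results structure shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
    "harness|arc:challenge|25": {"acc_norm": 0.7286689419795221},
}

# Keys look like "harness|<task>|<num_fewshot>"; filter to MMLU subtasks
# and strip the "hendrycksTest-" prefix to get readable subject names.
mmlu_scores = {
    key.split("|")[1].removeprefix("hendrycksTest-"): task["acc_norm"]
    for key, task in results.items()
    if key.startswith("harness|hendrycksTest-")
}
# mmlu_scores == {'abstract_algebra': 0.35, 'anatomy': 0.6370370370370371}
```

The same pattern extends to `acc`, `acc_stderr`, or the TruthfulQA `mc1`/`mc2` fields, which live under their own task keys.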
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
allenai/paloma | ---
extra_gated_prompt: "Access to this dataset is automatically granted upon accepting the [**AI2 ImpACT License – Low Risk Artifacts (“LR Agreement”)**](https://allenai.org/licenses/impact-lr) and completing all fields below. All data subsets in this dataset are licensed under the LR Agreement, except for those as listed in the 'License' section of the Dataset Card."
extra_gated_fields:
Your full name: text
Organization or entity you are affiliated with: text
State or country you are located in: text
Contact email: text
Please describe your intended use of the low risk artifact(s): text
I AGREE to the terms and conditions of the LR Agreement above: checkbox
I AGREE to AI2’s use of my information for legal notices and administrative matters: checkbox
I CERTIFY that the information I have provided is true and accurate: checkbox
dataset_info:
- config_name: 4chan_meta_sep
features:
- name: text
dtype: string
- name: id
dtype: string
- name: added
dtype: string
- name: source
dtype: string
- name: metadata
struct:
- name: original_ids
sequence: int64
- name: original_times
sequence: int64
- name: semantic_url
dtype: string
- name: truncated_portion
dtype: string
- config_name: c4_100_domains
features:
- name: text
dtype: string
- name: id
dtype: string
- name: added
dtype: string
- name: source
dtype: string
- name: subdomain
dtype: string
- config_name: c4_en
features:
- name: text
dtype: string
- name: id
dtype: string
- name: added
dtype: string
- name: source
dtype: string
- name: metadata
struct:
- name: url
dtype: string
- name: date
dtype: string
- name: truncated_portion
dtype: string
- config_name: dolma-v1_5
features:
- name: text
dtype: string
- name: id
dtype: string
- name: added
dtype: string
- name: source
dtype: string
- name: subdomain
dtype: string
- name: metadata
dtype: struct
- config_name: dolma_100_programming_languages_no_attributes
features:
- name: text
dtype: string
- name: id
dtype: string
- name: added
dtype: string
- name: source
dtype: string
- name: subdomain
dtype: string
- name: metadata
dtype: struct
- name: timestamp
dtype: timestamp[s]
configs:
- config_name: 4chan_meta_sep
data_files:
- split: val
path: "4chan_meta_sep/val/*"
- split: test
path: "4chan_meta_sep/test/*"
- config_name: c4_100_domains
data_files:
- split: val
path: "c4_100_domains/val/*"
- split: test
path: "c4_100_domains/test/*"
- config_name: c4_en
data_files:
- split: val
path: "c4_en/val/*"
- split: test
path: "c4_en/test/*"
- config_name: dolma-v1_5
data_files:
- split: val
path: "dolma-v1_5/val/*"
- split: test
path: "dolma-v1_5/test/*"
- config_name: dolma_100_programming_languages_no_attributes
data_files:
- split: val
path: "dolma_100_programming_languages_no_attributes/val/*"
- split: test
path: "dolma_100_programming_languages_no_attributes/test/*"
- config_name: dolma_100_subreddits
data_files:
- split: val
path: "dolma_100_subreddits/val/*"
- split: test
path: "dolma_100_subreddits/test/*"
- config_name: falcon-refinedweb
data_files:
- split: val
path: "falcon-refinedweb/val/*"
- split: test
path: "falcon-refinedweb/test/*"
- config_name: gab
data_files:
- split: val
path: "gab/val/*"
- split: test
path: "gab/test/*"
- config_name: m2d2_s2orc_unsplit
data_files:
- split: val
path: "m2d2_s2orc_unsplit/val/*"
- split: test
path: "m2d2_s2orc_unsplit/test/*"
- config_name: m2d2_wikipedia_unsplit
data_files:
- split: val
path: "m2d2_wikipedia_unsplit/val/*"
- split: test
path: "m2d2_wikipedia_unsplit/test/*"
- config_name: manosphere_meta_sep
data_files:
- split: val
path: "manosphere_meta_sep/val/*"
- split: test
path: "manosphere_meta_sep/test/*"
- config_name: mc4
data_files:
- split: val
path: "mc4/val/*"
- split: test
path: "mc4/test/*"
- config_name: ptb
data_files:
- split: val
path: "ptb/val/*"
- split: test
path: "ptb/test/*"
- config_name: redpajama
data_files:
- split: val
path: "redpajama/val/*"
- split: test
path: "redpajama/test/*"
- config_name: twitterAAE_HELM_fixed
data_files:
- split: val
path: "twitterAAE_HELM_fixed/val/*"
- split: test
path: "twitterAAE_HELM_fixed/test/*"
- config_name: wikitext_103
data_files:
- split: val
path: "wikitext_103/val/*"
- split: test
path: "wikitext_103/test/*"
---
# Dataset Card for Paloma
<!-- Provide a quick summary of the dataset. -->
Language models (LMs) commonly report perplexity on monolithic data held out from the training distribution.
Implicitly or explicitly, this data is composed of domains—variations in the distribution of language. Rather than assuming perplexity on one distribution extrapolates to others, Perplexity Analysis for Language Model Assessment (Paloma) measures LM fit to 585 text domains, ranging from NY Times to r/depression on Reddit.
## Dataset Details
### Benchmark Inference and Submissions
We invite submissions to our benchmark and organize results by comparability based on compliance with guidelines such as the removal of benchmark contamination from pretraining data. Standardized inference code for running comparable evaluations, along with details about making submissions to the Paloma benchmark, can be found at the following link.
[How to evaluate and how to submit](https://github.com/allenai/ai2-olmo-eval/blob/main/paloma/README.md)
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
Paloma is for examining relative differences in LM fit on domains. We take these relative differences as a proxy of model fit to the shared knowledge, values, and social context that position the humans producing language in a domain. While we expect contemporary LMs to have a limited fit to the most complex of these latent factors of domains, improving fit to all factors is necessary both to improve perplexity and for any actual use of the LM. For example, better perplexity on a particular dialect of English suggests that the model will make a better chatbot for people who speak that dialect.
The sources of evaluation data in Paloma were selected based on the following desiderata: 1) including known resources, 2) including fine-grained domains, 3) including domains representing specific communities of interest. Different lines of research will require different selections of domains; Paloma aims to enable research on differences in LM fit over the hundreds of domains that are readily available in existing metadata.
Note that we are not able to re-host 2 of the 18 sources in Paloma comprising 39 domains. These are The Pile and ICE. The ICE corpus is available on request to the original authors following the instructions [here](https://www.ice-corpora.uzh.ch/en/access.html).
**Curated by:** Ian Magnusson, Akshita Bhagia, Valentin Hofmann, Luca Soldaini, Ananya Harsh Jha, Oyvind Tafjord, Dustin Schwenk, Evan Pete Walsh, Yanai Elazar, Kyle Lo, Dirk Groeneveld, Iz Beltagy, Hannaneh Hajishirzi, Noah A. Smith, Kyle Richardson, and Jesse Dodge
**Languages:** We elect to focus just on the language modeling of English and code data.
**License:** The data subsets are licensed under the AI2 ImpACT License - Low Risk Artifacts, except as listed below.
- Wikitext-103 - CC BY-SA
- TwitterAAE - for research purposes only
- Red Pajama - see license details
- M2D2 - CC BY-NC
**Paper:** https://arxiv.org/abs/2312.10523
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
<!-- - [Paper]() -- (TODO update when paper is preprinted) -->
<!-- - [Website](paloma.allen.ai) -->
- [Code](https://github.com/allenai/ai2-olmo-eval/blob/main/paloma/README.md)
- Paloma 1B Baseline Models: [Dolma](https://huggingface.co/allenai/paloma-1b-baseline-dolma), [Pile](https://huggingface.co/allenai/paloma-1b-baseline-pile), [RedPajama](https://huggingface.co/allenai/paloma-1b-baseline-redpajama), [C4](https://huggingface.co/allenai/paloma-1b-baseline-c4), [mC4-en](https://huggingface.co/allenai/paloma-1b-baseline-mc4), [Falcon-RefinedWeb](https://huggingface.co/allenai/paloma-1b-baseline-falcon-refinedweb)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
This benchmark is intended for use in evaluating language model fit to fine-grained domains.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset should be used for evaluating the likelihood that a language model assigns to text from a given domain.
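As a sketch of the core quantity involved: perplexity over a domain is the exponential of the negative mean per-token log-likelihood that the model assigns to that domain's text. The log-probabilities below are hypothetical; obtaining them from a real model is what the standardized inference code linked above handles.

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the negative mean per-token log-likelihood."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# Hypothetical per-token natural-log probabilities for one document:
logprobs = [-2.3, -0.7, -1.1, -0.2]
ppl = perplexity(logprobs)  # lower is better fit to the domain
```

Comparing this quantity across Paloma's domains, rather than on one monolithic held-out set, is the benchmark's central use.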
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
Note that the sources contained in this benchmark include varying licenses with differing restrictions (see [License](#dataset-description))
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The sources in this dataset are each organized into their own subcorpus, consisting of a `val` and a `test` split. Data within these splits is organized as files of line-separated JSON, where each line represents a document and its associated metadata. The type of metadata available varies from source to source, but each line contains at least a field `'text'` holding the text of the document.
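The line-separated JSON layout can be read with the standard library alone; the example lines below are hypothetical, but the field names mirror the features declared in this card's YAML header:

```python
import json

# Hypothetical lines from a Paloma subcorpus file: one JSON document per
# line; metadata fields vary by source, but 'text' is always present.
lines = [
    '{"text": "First document.", "id": "doc-0", "source": "c4_en"}',
    '{"text": "Second document.", "id": "doc-1", "source": "c4_en", "subdomain": "news"}',
]

docs = [json.loads(line) for line in lines]
texts = [doc["text"] for doc in docs]
```

In practice the subcorpora can also be loaded through the `datasets` library (e.g. `load_dataset("allenai/paloma", "c4_en", split="val")`, using the config names from the YAML header) once access has been granted under the license agreement.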
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Perplexity is conventionally reported on held-out data from a model's training distribution or on a small number of traditional test sets. Such monolithic evaluation ignores potential variation of model fit across the different domains that LMs implicitly learn to model. We curate sources of fine-grained textual domains in Paloma to enable evaluation of language model fit to specific domains of text. Paloma is inspired by and incorporates previous work that curates corpora with marked domains (The Pile, M2D2, C4 100 Domains, ICE, TwitterAAE). We conduct stratified subsampling over domains, setting a minimum subsample size based on empirical estimation of the variance over subsamples.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Standard language modeling benchmarks
Though it is common practice to evaluate on held out data from the pretraining corpus of a given model, we evaluate *across* several major pretraining corpora and standard language modeling benchmarks. We also break down performance per domain within the datasets that have multiple domains. Note that although the Paloma benchmark analysis in our paper describes results on the Pile, we are not able to re-host this data.
| Source | Citation | Description |
|-------------------|-----------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| c4-en | Raffel et al (2019) via Dodge et al (2021) | Standard contemporary LM pretraining corpus automatically filtered from the April 2019 Common Crawl scrape |
| mc4-en | Xue et al (2021) | The English language portion of a pretraining corpus automatically filtered from 71 Common Crawl scrapes |
| Pile | Gao et al (2020) | Standard contemporary LM benchmark from curated multi-source data including large scale non-webscraped sources |
| Wikitext-103 | Merity et al (2016) | A standard collection of verified “Good” and “Featured” articles on Wikipedia |
| Penn Tree Bank | Marcus et al (1999) via Nunes, Davide. (2020) | Classic Wall Street Journal benchmark with linguistic structure annotations omitted |
| RedPajama | Together Computer (2023) | A publicly available reproduction of the LLaMA (Touvron et al., 2023) pretraining source mixture, combining large amounts of webscraped text with smaller curated sources |
| Falcon-RefinedWeb | Penedo et al. (2023) | A corpus of English sampled from all Common Crawl scrapes until June 2023, more aggressively filtered and deduplicated than c4 and mc4-en |
| Dolma v1.5 | Soldaini et al. (2023) | A three trillion token corpus that samples sources commonly used to train LMs in order to enable open research on pretraining data |
#### Fine-grained domain benchmarks
Where typical pretraining corpora offer at most tens of labeled domains usually based on where the data is sourced, we examine datasets with up to an order of magnitude more domains. Existing datasets (M2D2 and c4 100 Domains) and datasets we curate from Dolma v1.5 use metadata to define hundreds of domains over Wikipedia, Semantic Scholar, Common Crawl, Reddit, and Github data. These include diverse domains from *Culture and the arts: Performing arts*, a topic on Wikipedia, to *r/depression*, a forum on Reddit for mental health support.
| Source | Citation | Description |
|---------------------------------|--------------------------------------------------|-----------------------------------------------------------------------------------|
| M2D2 S2ORC | Reid et al (2022) | Papers from Semantic Scholar grouped by hierarchical academic field categories |
| M2D2 Wiki | Reid et al (2022) | Wikipedia articles grouped by hierarchical categories in the Wikipedia ontology |
| c4 100 Domains | Chronopoulou et al (2021) | Balanced samples of the top 100 URL domains in C4 |
| Dolma 100 Subreddits | Soldaini et al. (2023) | Balanced samples of the top 100 Subreddits from the Dolma Reddit subset |
| Dolma 100 Programming Languages | Kocetkov et al. (2022) via Soldaini et al. (2023) | Balanced samples of the top 100 programming languages from the Dolma Stack subset |
#### Disparities between speech communities
Some communities are known to be underserved by existing models. Following HELM, we measure disparities in performance on corpora of African American English and White-aligned English from TwitterAAE, as well as nine corpora of English from different countries from the ICE dataset. Note that although the Paloma benchmark analysis in our paper describes results on ICE, we are not able to re-host this data.
| Source | Citation | Description |
|------------|----------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| ICE | Greenbaum and Nelson (1996) via Liang et al (2022) | English from around the world curated by local experts, with subsets for Canada, East Africa, Hong Kong, India, Ireland, Jamaica, Philippines, Singapore, and the USA |
| TwitterAAE | Blodgett et al. (2016) via Liang et al (2022)      | Balanced sets of tweets classified as African American or White-aligned English                                                                                         |
#### Fringe sources previously studied for problematic discourse
Text from some fringe online communities has been shown to contain larger proportions of hate speech and toxicity than more mainstream sources. [Longpre et al. (2023)](https://arxiv.org/abs/2305.13169) have shown that varying the amount of toxic content in pretraining data trades off non-toxic generation against the ability to classify toxicity, indicating that model fit to discourse containing toxicity is worth measuring. Measuring perplexity on Manosphere, Gab, and 4chan characterizes model familiarity with the distinct social contexts in which toxic language arises.
| Source | Citation | Description |
|-------------------|------------------------|---------------------------------------------------------------------------------------------------------------------------------------------|
| Manosphere Corpus | Ribeiro et al (2020) | 9 forums where a set of related masculinist ideologies developed over the 2000s and 2010s |
| Gab Corpus | Zannettou et al (2018) | Data from 2016-18 from an alt-right, free-speech-oriented social media platform shown to contain more hate speech than mainstream platforms |
| 4chan Corpus | Papasavva et al (2020) | Data from 2016-19 from a politics subforum of an anonymity-focused forum found to contain among the highest rates of toxic content |
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The data in Paloma are sampled from existing sources. Most often, perplexity evaluation data is subsampled uniformly over the original distribution of domains in a source, resulting in more or fewer tokens from each domain in the evaluation data depending on how well represented the domain is in the corpus. We instead employ stratified sampling, in which all sources with marked domains are partitioned by domain and a uniform sample of the same size is taken from each partition. Specifically, documents are sampled from each domain until a target number of tokens is reached. This helps ensure that no domains are lost or left very small after subsampling.
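A minimal sketch of that stratified subsampling idea follows. This is an illustration, not Paloma's actual pipeline: token counts here are naive whitespace splits, and the real implementation (see the paper and linked code) handles tokenization and source-specific details differently.

```python
import random

def stratified_sample(docs_by_domain, target_tokens, seed=0):
    """Sample documents per domain until each partition reaches ~target_tokens."""
    rng = random.Random(seed)
    sample = {}
    for domain, docs in docs_by_domain.items():
        shuffled = docs[:]
        rng.shuffle(shuffled)
        picked, tokens = [], 0
        for doc in shuffled:
            if tokens >= target_tokens:
                break  # this domain's token budget is met
            picked.append(doc)
            tokens += len(doc.split())  # naive whitespace token count
        sample[domain] = picked
    return sample

# Hypothetical toy corpus partitioned by domain:
corpus = {
    "r/depression": ["a b c d", "e f", "g h i"],
    "NY Times": ["one two three four five", "six seven"],
}
sample = stratified_sample(corpus, target_tokens=4)
```

The key property is that every domain contributes roughly the same number of tokens, rather than tokens proportional to the domain's share of the source corpus.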
In social media domains with additional metadata that is typically displayed along with posts, we format metadata such as timestamps into the document `'text'` field. Where information is available about how threads of posts are connected, documents in that domain contain all posts in a given thread.
Additional details on source specific processing are available in our paper.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Text data from each of the sources curated in Paloma is created by varying sets of original authors. Some sources are collected from users of specific internet fora such as specific subreddits. Other data is collected on the basis of expert or automated classification of demographic groups. Other data is collected from authors of archival material including scientific preprints, Wikipedia, and code repositories. Lastly, data sampled from standard pretraining corpora comes from authors collected through automatic web scraping and large-scale sampling of archival sources, making it difficult to recover much specific information about these authors.
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
No annotation is done on this data.
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
No annotation is done on this data.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
Sources in Paloma may contain personally identifiable information (PII). No attempt is made to measure or remove this information for the following reason: Paloma provides a small subsample of already publicly available data. The small size of this subsample renders this data less useful for aggregation of PII information than the already available public sources which we subsample.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
It is beyond the scope of any one group of researchers to prescribe an exhaustive set of domains that should be examined for an LM. Rather, Paloma brings together a substantial selection of domains that are identifiable from already available metadata to demonstrate the kinds of analyses possible with hundreds of domains and rigorous experimental controls.
Different research goals will motivate different definitions and selections of domains, but other researchers can apply the guidelines we detail in our paper to novel fine-grained domains suitable for their research questions. One of the key advantages of evaluating a model by its fit to a collection of text representing a domain is that such domains can be identified not just by researchers who study LMs. We hope future work will identify many more domains that no one discipline would think to look at.
In Paloma, we distinguish sources from domains, although not all cases permit such easy distinction. We use *source* to refer to a selection of data that is characterized by the decisions of the people who curated it, whether that curation is automatic, as in scraping C4, or manual, as in selecting the subcorpora of The Pile. By contrast, we use *domain* to refer to a set of documents that belong together because they are originally produced by a group of humans who share a distinct social context. Considered as such, domains may overlap; a document's author may belong to both the set of English speakers in Jamaica and the set of AI researchers. Further, note that domains are often latent categorizations which we only approximate, because complete metadata does not exist.
Also, some domains in Paloma appear in multiple sources, such as academic papers. Though The Pile and RedPajama process academic papers differently, the subcorpora on academic papers in each source represent different approximations of the same or very similar domains. However, for the sake of simplicity, we make the reductive assumption of counting all 585 domains in Paloma as fully distinct.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
In our paper we outline guidelines for evaluating language model fit. We encourage users of Paloma to adopt these experimental controls for metric variance when subsampling, benchmark contamination, differing tokenization, training data order, and evaluation data format.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@article{paloma,
title={{Paloma}: A Benchmark for Evaluating Language Model Fit},
author={Magnusson, Ian and Bhagia, Akshita and Hofmann, Valentin and Soldaini, Luca and Harsh Jha, Ananya and Tafjord, Oyvind and Schwenk, Dustin and Walsh, Evan Pete and Elazar, Yanai and Lo, Kyle and Groeneveld, Dirk and Beltagy, Iz and Hajishirzi, Hannaneh and Smith, Noah A. and Richardson, Kyle and Dodge, Jesse},
journal={technical report},
year={2023},
url={https://paloma.allen.ai/}
}
```
<!-- [More Information Needed] -->
## Dataset Card Contact
{ianm,jessed}@allenai.org |
liuyanchen1015/MULTI_VALUE_cola_who_as | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1980
num_examples: 23
- name: test
num_bytes: 1352
num_examples: 17
- name: train
num_bytes: 21437
num_examples: 245
download_size: 17708
dataset_size: 24769
---
# Dataset Card for "MULTI_VALUE_cola_who_as"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vphu123/libri_whisper_base | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: float64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 27418458704
num_examples: 28539
- name: val
num_bytes: 2596462440
num_examples: 2703
- name: test
num_bytes: 2516732696
num_examples: 2620
download_size: 4886012298
dataset_size: 32531653840
---
# Dataset Card for "libri_whisper_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ReadingTimeMachine/rtm-sgt-ocr-v1 | ---
license: apache-2.0
task_categories:
- text-classification
- translation
language:
- en
size_categories:
- 1M<n<10M
---
## Data Introduction
Over 1.5 million synthetically generated ground-truth/[OCR](https://en.wikipedia.org/wiki/Optical_character_recognition) pairs for post-correction tasks from our paper "[Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles](https://dl.acm.org/doi/10.1007/978-3-031-43849-3_23)".
Synthetic ground truth (SGT) sentences have been mined from the [ar𝜒iv Bulk Downloads](https://info.arxiv.org/help/bulk_data/index.html) source documents,
and Optical Character Recognition (OCR)
sentences have been produced by running the [Tesseract](https://github.com/tesseract-ocr/tesseract) OCR engine on PDF pages rendered from the compiled source documents.
SGT/OCR pairs come from astronomy articles in the years 1991-2011.
No page augmentation has been applied to any of the PDF documents (i.e. these are "clean" pages without warping, dust, etc.)
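As a rough illustration of the post-correction metrics this dataset supports, the character error rate (CER) between an SGT sentence and its OCR counterpart can be computed with a standard Levenshtein edit distance (a generic sketch, not the paper's evaluation code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two strings (insert/delete/substitute each cost 1)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def cer(reference, hypothesis):
    """Character error rate: edit distance normalized by reference length."""
    return edit_distance(reference, hypothesis) / max(len(reference), 1)
```

A post-correction model is considered to improve OCR output when the CER of its corrected text against the SGT reference is lower than the CER of the raw OCR text.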
## Resources
### Dataset Versions
* V0 (the original version released with the paper) is available [here](https://zenodo.org/records/8006584)
## Citation
Please reference the following if you make use of this dataset:
```
@inproceedings{10.1007/978-3-031-43849-3_23,
author = {Naiman, J. P. and Cosillo, Morgan G. and Williams, Peter K. G. and Goodman, Alyssa},
title = {Large Synthetic Data From the arχiv For OCR Post Correction Of Historic Scientific Articles},
year = {2023},
isbn = {978-3-031-43848-6},
publisher = {Springer-Verlag},
address = {Berlin, Heidelberg},
url = {https://doi.org/10.1007/978-3-031-43849-3_23},
doi = {10.1007/978-3-031-43849-3_23},
abstract = {Historical scientific articles often require Optical Character Recognition (OCR) to transform scanned documents into machine-readable text, a process that often produces errors. We present a pipeline for the generation of a synthetic ground truth/OCR dataset to correct the OCR results of the astrophysics literature holdings of the NASA Astrophysics Data System (ADS). By mining the arχiv we create, to the authors’ knowledge, the largest scientific synthetic ground truth/OCR post correction dataset of 203,354,393 character pairs. Baseline models trained with this dataset find the mean improvement in character and word error rates of 7.71\% and 18.82\% for historical OCR text, respectively. Interactive dashboards to explore the dataset are available online: , and data and code, are hosted on GitHub: .},
booktitle = {Linking Theory and Practice of Digital Libraries: 27th International Conference on Theory and Practice of Digital Libraries, TPDL 2023, Zadar, Croatia, September 26–29, 2023, Proceedings},
pages = {265–274},
numpages = {10},
keywords = {scholarly document processing, optical character recognition, astronomy},
location = {Zadar, Croatia}
}
```
|
JesusPorto/autotrain-data-cilantroperejil | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: cilantroperejil
## Dataset Description
This dataset has been automatically processed by AutoTrain for project cilantroperejil.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<474x410 RGB PIL image>",
"target": 0
},
{
"image": "<474x575 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['cilantro', 'perejil'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 160 |
| valid | 40 |
|
open-llm-leaderboard/details_abacusai__Smaug-2-72B | ---
pretty_name: Evaluation run of abacusai/Smaug-2-72B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/Smaug-2-72B](https://huggingface.co/abacusai/Smaug-2-72B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Smaug-2-72B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T01:31:21.040468](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-2-72B/blob/main/results_2024-03-30T01-31-21.040468.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.767796907393265,\n\
\ \"acc_stderr\": 0.027982043283656534,\n \"acc_norm\": 0.7771325485017166,\n\
\ \"acc_norm_stderr\": 0.02849243108304867,\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104003,\n \"mc2\": 0.6489991000331015,\n\
\ \"mc2_stderr\": 0.015361203699885669\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946528\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6796454889464251,\n\
\ \"acc_stderr\": 0.004656591678606762,\n \"acc_norm\": 0.8636725751842262,\n\
\ \"acc_norm_stderr\": 0.0034243464481037195\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.02444238813110083,\n\
\ \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.02444238813110083\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n\
\ \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n\
\ \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n\
\ \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.708994708994709,\n\
\ \"acc_stderr\": 0.023393826500484865,\n \"acc_norm\": 0.708994708994709,\n\
\ \"acc_norm_stderr\": 0.023393826500484865\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768177,\n\
\ \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768177\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034516,\n\
\ \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034516\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865387,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865387\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792208,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792208\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \
\ \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.5111111111111111,\n \"acc_stderr\": 0.030478009819615823,\n \
\ \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.030478009819615823\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.02186325849485212,\n \
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.02186325849485212\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163086,\n \"\
acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163086\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571762,\n \"\
acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571762\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7175925925925926,\n \"acc_stderr\": 0.03070137211151092,\n \"\
acc_norm\": 0.7175925925925926,\n \"acc_norm_stderr\": 0.03070137211151092\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131694,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131694\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.02624113299640725,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.02624113299640725\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n\
\ \"acc_stderr\": 0.014450181176872735,\n \"acc_norm\": 0.9487179487179487,\n\
\ \"acc_norm_stderr\": 0.014450181176872735\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826372,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826372\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n\
\ \"acc_stderr\": 0.009866287394639541,\n \"acc_norm\": 0.9169859514687101,\n\
\ \"acc_norm_stderr\": 0.009866287394639541\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7262569832402235,\n\
\ \"acc_stderr\": 0.01491241309637243,\n \"acc_norm\": 0.7262569832402235,\n\
\ \"acc_norm_stderr\": 0.01491241309637243\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02082375883758091,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02082375883758091\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.021670058885510792,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.021670058885510792\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438287,\n\
\ \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6134751773049646,\n \"acc_stderr\": 0.029049190342543465,\n \
\ \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.029049190342543465\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6023468057366362,\n\
\ \"acc_stderr\": 0.012499840347460643,\n \"acc_norm\": 0.6023468057366362,\n\
\ \"acc_norm_stderr\": 0.012499840347460643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.7727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759026,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759026\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104003,\n \"mc2\": 0.6489991000331015,\n\
\ \"mc2_stderr\": 0.015361203699885669\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305894\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38514025777103866,\n \
\ \"acc_stderr\": 0.013404165536474305\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/Smaug-2-72B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|arc:challenge|25_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|gsm8k|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hellaswag|10_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T01-31-21.040468.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T01-31-21.040468.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- '**/details_harness|winogrande|5_2024-03-30T01-31-21.040468.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T01-31-21.040468.parquet'
- config_name: results
data_files:
- split: 2024_03_30T01_31_21.040468
path:
- results_2024-03-30T01-31-21.040468.parquet
- split: latest
path:
- results_2024-03-30T01-31-21.040468.parquet
---
# Dataset Card for Evaluation run of abacusai/Smaug-2-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/Smaug-2-72B](https://huggingface.co/abacusai/Smaug-2-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-2-72B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-30T01:31:21.040468](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-2-72B/blob/main/results_2024-03-30T01-31-21.040468.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.767796907393265,
"acc_stderr": 0.027982043283656534,
"acc_norm": 0.7771325485017166,
"acc_norm_stderr": 0.02849243108304867,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104003,
"mc2": 0.6489991000331015,
"mc2_stderr": 0.015361203699885669
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946528
},
"harness|hellaswag|10": {
"acc": 0.6796454889464251,
"acc_stderr": 0.004656591678606762,
"acc_norm": 0.8636725751842262,
"acc_norm_stderr": 0.0034243464481037195
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.02444238813110083,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.02444238813110083
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.708994708994709,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.708994708994709,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865387,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865387
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792208,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792208
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.0195652367829309,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.0195652367829309
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.030478009819615823,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.030478009819615823
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.02186325849485212,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.02186325849485212
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163086,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163086
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571762,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571762
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7175925925925926,
"acc_stderr": 0.03070137211151092,
"acc_norm": 0.7175925925925926,
"acc_norm_stderr": 0.03070137211151092
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131694,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131694
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640725,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640725
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872735,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872735
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639541,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639541
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442272
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7262569832402235,
"acc_stderr": 0.01491241309637243,
"acc_norm": 0.7262569832402235,
"acc_norm_stderr": 0.01491241309637243
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02082375883758091,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02082375883758091
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.021670058885510792,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.021670058885510792
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438287,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6134751773049646,
"acc_stderr": 0.029049190342543465,
"acc_norm": 0.6134751773049646,
"acc_norm_stderr": 0.029049190342543465
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6023468057366362,
"acc_stderr": 0.012499840347460643,
"acc_norm": 0.6023468057366362,
"acc_norm_stderr": 0.012499840347460643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759026,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759026
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104003,
"mc2": 0.6489991000331015,
"mc2_stderr": 0.015361203699885669
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305894
},
"harness|gsm8k|5": {
"acc": 0.38514025777103866,
"acc_stderr": 0.013404165536474305
}
}
```
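The per-task JSON above can be post-processed with standard tooling. Below is a minimal sketch that parses three of the entries copied verbatim from the results block and averages their `acc` values; note this is only an illustration over the copied subset, not the leaderboard's own aggregation, which averages all 57 MMLU (hendrycksTest) tasks.

```python
import json

# Three of the per-task entries from the results block above, as a JSON snippet.
snippet = """
{
  "harness|hendrycksTest-machine_learning|5": {"acc": 0.5892857142857143},
  "harness|hendrycksTest-virology|5": {"acc": 0.5843373493975904},
  "harness|hendrycksTest-marketing|5": {"acc": 0.9487179487179487}
}
"""
results = json.loads(snippet)

# Mean accuracy over the MMLU (hendrycksTest) tasks present in the snippet.
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.7074
```

The same filtering-by-prefix approach works on the full results file, since every MMLU subtask key shares the `harness|hendrycksTest-` prefix.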
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
trec-product-search/product-search-corpus | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- information retrieval
- product search
- dense retrieval
pretty_name: TREC Product Search Corpus
size_categories:
- 1M<n<10M
--- |
open-llm-leaderboard/details_kashif__stack-llama-2 | ---
pretty_name: Evaluation run of kashif/stack-llama-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kashif__stack-llama-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:19:04.146812](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-09-22T20-19-04.146812.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931194434,\n \"f1\": 0.05443896812080537,\n\
\ \"f1_stderr\": 0.0012685965060744062,\n \"acc\": 0.4202036533620397,\n\
\ \"acc_stderr\": 0.010294487617119145\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931194434,\n\
\ \"f1\": 0.05443896812080537,\n \"f1_stderr\": 0.0012685965060744062\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10007581501137225,\n \
\ \"acc_stderr\": 0.008266274528685624\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/kashif/stack-llama-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|arc:challenge|25_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|drop|3_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-19-04.146812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-19-04.146812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hellaswag|10_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|winogrande|5_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-19-04.146812.parquet'
- config_name: results
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- results_2023-08-29T07:07:44.494010.parquet
- split: 2023_09_22T20_19_04.146812
path:
- results_2023-09-22T20-19-04.146812.parquet
- split: latest
path:
- results_2023-09-22T20-19-04.146812.parquet
---
# Dataset Card for Evaluation run of kashif/stack-llama-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kashif/stack-llama-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
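The timestamped split names are derived from the run timestamps by replacing `-` and `:` with `_`. A minimal sketch of that mapping (the helper name is illustrative, not part of any library):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name.

    e.g. "2023-09-22T20:19:04.146812" -> "2023_09_22T20_19_04.146812"
    """
    # Split names keep the fractional seconds but swap date/time separators for '_'.
    return timestamp.replace("-", "_").replace(":", "_")
```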
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kashif__stack-llama-2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T20:19:04.146812](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-09-22T20-19-04.146812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.05443896812080537,
"f1_stderr": 0.0012685965060744062,
"acc": 0.4202036533620397,
"acc_stderr": 0.010294487617119145
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.05443896812080537,
"f1_stderr": 0.0012685965060744062
},
"harness|gsm8k|5": {
"acc": 0.10007581501137225,
"acc_stderr": 0.008266274528685624
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
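The aggregated `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies (drop reports `em`/`f1` rather than `acc`); a quick sketch reproducing it from the values above:

```python
# Per-task accuracies from the latest run (drop reports em/f1, not acc).
task_acc = {
    "harness|gsm8k|5": 0.10007581501137225,
    "harness|winogrande|5": 0.7403314917127072,
}

# The unweighted mean reproduces the "all" accuracy of 0.4202036533620397.
all_acc = sum(task_acc.values()) / len(task_acc)
print(all_acc)
```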
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arvisioncode/donut-funsd | ---
dataset_info:
features:
- name: ground_truth
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 25994868.0
num_examples: 147
- name: test
num_bytes: 9129119.0
num_examples: 47
- name: validation
num_bytes: 9129119.0
num_examples: 47
download_size: 44182619
dataset_size: 44253106.0
---
# Dataset Card for "donut-funsd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Saxo/cn_ko_translation_tech_social_science_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
Saxo/Korean-Corpus-From-Various-Task-1 | ---
license: apache-2.0
---
|
bond005/sberdevices_golos_10h_crowd_noised_2db | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 995134360.1
num_examples: 7550
- name: test
num_bytes: 1286746658.04
num_examples: 9896
- name: validation
num_bytes: 100885232.0
num_examples: 755
download_size: 2303855481
dataset_size: 2382766250.14
---
# Dataset Card for "sberdevices_golos_10h_crowd_noised_2db"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft | ---
pretty_name: Evaluation run of CorticalStack/mistral-7b-metamathqa-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6109833546650115,\n\
\ \"acc_stderr\": 0.033243551615710396,\n \"acc_norm\": 0.6155806742206059,\n\
\ \"acc_norm_stderr\": 0.03392554504142213,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n\
\ \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344083,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.01440136664121639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6065524795857399,\n\
\ \"acc_stderr\": 0.0048751626991216535,\n \"acc_norm\": 0.8044214299940251,\n\
\ \"acc_norm_stderr\": 0.0039583479345203345\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608302,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608302\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399324,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399324\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459752,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459752\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371165,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371165\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520425,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520425\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n\
\ \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205191\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \
\ \"acc_stderr\": 0.013504357787494035\n }\n}\n```"
repo_url: https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- '**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet'
- config_name: results
data_files:
- split: 2024_02_16T16_13_27.633644
path:
- results_2024-02-16T16-13-27.633644.parquet
- split: latest
path:
- results_2024-02-16T16-13-27.633644.parquet
---
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
"harness_winogrande_5",
	split="latest")
```
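The config names above follow a fixed pattern derived from the harness task names that appear in the parquet file paths (`harness_<task>_<n_shot>`, with `-` and `:` replaced by `_`). A small helper, illustrative only and not part of the dataset tooling, can build them:

```python
def config_name(task: str, n_shot: int) -> str:
    """Map a harness task name (as it appears in the parquet file names)
    to its dataset config name, e.g. "hendrycksTest-world_religions" with
    5 shots -> "harness_hendrycksTest_world_religions_5"."""
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{n_shot}"

# Usage with load_dataset (requires network access):
# from datasets import load_dataset
# details = load_dataset(
#     "open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
#     config_name("hendrycksTest-world_religions", 5),
#     split="latest",
# )
```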
## Latest results
These are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6109833546650115,
"acc_stderr": 0.033243551615710396,
"acc_norm": 0.6155806742206059,
"acc_norm_stderr": 0.03392554504142213,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344083,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.01440136664121639
},
"harness|hellaswag|10": {
"acc": 0.6065524795857399,
"acc_stderr": 0.0048751626991216535,
"acc_norm": 0.8044214299940251,
"acc_norm_stderr": 0.0039583479345203345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022884,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608302,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399324,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399324
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459752,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459752
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371165,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371165
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520425,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354022,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205191
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494035
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
factored/us_patent_hub | ---
dataset_info:
features:
- name: description
dtype: string
- name: abstract
dtype: string
- name: cpc
dtype: int64
splits:
- name: train
num_bytes: 3999772077.0
num_examples: 130164
download_size: 1624852320
dataset_size: 3999772077.0
---
# Dataset Card for "us_patent_hub"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuhaotian/LLaVA-CC3M-Pretrain-595K | ---
license: other
language:
- en
pretty_name: LLaVA CC3M Pretrain 595K
---
# LLaVA Visual Instruct CC3M 595K Pretrain Dataset Card
## Dataset details
**Dataset type:**
LLaVA Visual Instruct CC3M Pretrain 595K is a subset of CC-3M dataset, filtered with a more balanced concept coverage distribution.
Captions are also associated with [BLIP synthetic caption](https://github.com/salesforce/BLIP#pre-training-datasets-download) for reference.
It is constructed for the pretraining stage for feature alignment in visual instruction tuning.
We aim to build large multimodal models with GPT-4-level vision/language capability.
**Dataset date:**
LLaVA Visual Instruct CC3M Pretrain 595K was created in April 2023.
**Dataset structure:**
- `chat.json` contains the multimodal synthesized conversation from the image-caption pairs, by adding randomly selected instructions like: "Describe this image". It is used for pretraining in LLaVA. We use the raw CC-3M caption as the default answer.
- `metadata.json` contains the meta data of the image index in CC-3M, image file name, image URL, original CC-3M caption, synthetic BLIP caption. Note that ~10% of the samples are not associated with BLIP caption yet in this release.
- `images.zip` contains all raw images of the filtered subset from CC-3M. **Important notice: Upon the request from the community, as ~15% images of the original CC-3M dataset are no longer accessible, we upload `images.zip` for better reproducing our work in research community. It should not be used for any other purpose. The use of these images must comply with the CC-3M license. This may be taken down when requested by the original CC-3M dataset owner or owners of the referenced images.**
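As a concrete illustration of how `chat.json` entries are built from image-caption pairs, the sketch below pairs a randomly selected instruction with the raw caption as the answer. The instruction pool and field names here are illustrative assumptions, not the exact ones used by LLaVA.

```python
import random

# Hypothetical instruction pool; the actual pool used by LLaVA may differ.
INSTRUCTIONS = [
    "Describe this image.",
    "What is in the photo?",
    "Give a short caption for the picture.",
]

def make_chat_record(image_file: str, caption: str, seed: int = 0) -> dict:
    """Pair a randomly chosen instruction with the raw CC-3M caption as the answer."""
    rng = random.Random(seed)
    instruction = rng.choice(INSTRUCTIONS)
    return {
        "image": image_file,
        "conversations": [
            {"from": "human", "value": f"<image>\n{instruction}"},
            {"from": "gpt", "value": caption},
        ],
    }

record = make_chat_record("000000001.jpg", "a dog running on the beach")
print(record["conversations"][1]["value"])  # the raw caption is the default answer
```

The point of this stage is only feature alignment, so the answer side stays as the unmodified CC-3M caption.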
**Paper or resources for more information:**
https://llava-vl.github.io/
**License:**
Must comply with license of [CC-3M](https://github.com/google-research-datasets/conceptual-captions/blob/master/LICENSE), [BLIP](https://github.com/salesforce/BLIP/blob/main/LICENSE.txt) (if you use their synthetic caption).
CC-3M
The dataset may be freely used for any purpose, although acknowledgement of
Google LLC ("Google") as the data source would be appreciated. The dataset is
provided "AS IS" without any warranty, express or implied. Google disclaims all
liability for any damages, direct or indirect, resulting from the use of the
dataset.
**Where to send questions or comments about the model:**
https://github.com/haotian-liu/LLaVA/issues
## Intended use
**Primary intended uses:**
The primary use of LLaVA is research on large multimodal models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. |
grsilva/ptWikidataRebel | ---
license: mit
---
|
Otherwa/animes_dataset | ---
license: openrail
---
|
ontocord/onto4all | ---
dataset_info:
features:
- name: conversation
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: type
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 49071644
num_examples: 26278
download_size: 22408363
dataset_size: 49071644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
language:
- vi
size_categories:
- 10K<n<100K
---
# Onto4All: Enhancing Multilingual Conversational AI
We are excited to introduce **Onto4All**, a subsample of other performant open-source conversational datasets. We start with a carefully curated subset of the **OpenHermes-2.5-Viet dataset**, co-created by **[@qnguyen3](https://twitter.com/stablequan)** and **[@teknium](https://twitter.com/teknium)**. This dataset is specifically designed to support the training and evaluation of multilingual language models, such as Vistral-7B-chat and VinaLlama-7B-chat, and is derived from our Supervised Fine-Tuning (SFT) data.
We have included Vietnamese here, but will add more languages.
# Version:
This is 0.1.
# Dataset Description
For version 0.1, we stripped down our OpenHermes-2.5 dataset from 1M data items to a subsample of 25,000+ and translated them to Vietnamese. The subsampling was done by applying TopicBERT and using cosine-similarity embedding-based clustering to slim down the original dataset.
This dataset includes the following topics:
- Role Playing
- Context-Aware Question Answering
- Agent Prompting
- Coding
- and more...
The majority of the answers in this dataset were generated by state-of-the-art language models, **GPT-4** and **Claude 2.1**, ensuring high-quality and diverse responses.
We claim no rights to the output generated by other models, and merely redistribute a translation of the original OpenHermes-2.5 dataset.
The intention of the dataset is to be used to research multi-lingual abilities of AI models in order to ensure fairness and equal access.
# License
Our own modifications to the OpenHeremes-2.5 dataset is licensed under CC-0. However, the original OpenHermes-2.5 dataset is licensed as set forth [here](https://huggingface.co/datasets/teknium/OpenHermes-2.5).
Moreover, you should look over the GPT4 and Claude terms of use and consult your advisors for the applicability of using the data for your purposes.
# Acknowledgement
We would like to express our sincere gratitude to the following individuals and organizations for their invaluable contributions:
- **@teknium**, **@autometa**, and other open-source dataset creators for the remarkable OpenHermes-2.5 Dataset, which serves as the foundation for ViHermes-25K.
- **@qnguyen3** and **@nampdn-ai** for their dedication in translating and regenerating the answers in Vietnamese, making this dataset accessible to the Vietnamese AI community.
We are committed to fostering collaboration and advancement in the field of natural language processing, and we believe that **Onto4All** will be a valuable resource for researchers and developers alike.
# Notice
Please be aware that you use this dataset at your own risk and we disclaim all liabilities with respect to the data, including any harmful or bias responses. This dataset has **NOT** been filtered for safety.
Moreover, we disclaim all warranties, whether express or implied, and all liabilities with respect to infringement, fitness for a particular purpose, or otherwise.
```
@article{Onto4All2024,
title={Onto4All: Enhancing Multilingual Conversational AI},
  author={Nguyen, Q.},
journal={GitHub repository},
year={2024},
publisher={HuggingFace Datasets}
}
``` |
sdadasfgdfgfdg/Pyke | ---
license: openrail
---
|
joey234/mmlu-college_biology-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 75299
num_examples: 144
download_size: 49015
dataset_size: 75299
---
# Dataset Card for "mmlu-college_biology-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_46 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 123828998
num_examples: 12724
download_size: 35687915
dataset_size: 123828998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_46"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/data-standardized_cluster_17_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4428976
num_examples: 1857
download_size: 1900676
dataset_size: 4428976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_17_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kunishou/oasst1-89k-ja | ---
license: apache-2.0
language:
- ja
configs:
- config_name: default
data_files:
- split: train
path: "oasst1_89k_ja_20231027.json"
---

This dataset was created by automatically translating "OpenAssistant/oasst1" into Japanese.
The "ng_translation" flag indicates whether the translation was successful; a value of "1" means the translation failed.
Therefore, for data flagged with "1", "text" and "text_en" contain the same text.
**Update:**
- 2023/11/12
Released [oasst1-chat-44k-ja](https://huggingface.co/datasets/kunishou/oasst1-chat-44k-ja), a version of oasst1-89k-ja converted into chat format.
- 2023/10/21
Manually fixed roughly 2,000 translation errors in code-related data introduced by the automatic translation.
**<details><summary>Show an example of the corrections</summary><div>**
- Before correction
```
もちろん!これは、Flask Webフレームワークを使用して文字列を提供する単純なAPIエンドポイントを作成するPythonスクリプトの例です。
フラスコ輸入フラスコから
app = flask(__name__)
@app.route( '/')
def hello_world():
「こんにちは、世界!」を返します
__name__ == '__main__'の場合:
app.run()
このスクリプトでは、最初にフラスコモジュールからフラスコクラスをインポートします。次に、__Name__変数を使用してアプリケーションの名前を指定するフラスコクラスの新しいインスタンスを作成します。
```
- After correction
```
もちろん!これは、Flask Webフレームワークを使用して文字列を提供する単純なAPIエンドポイントを作成するPythonスクリプトの例です。
from flask import Flask
app = Flask(__name__)
@app.route('/')
def hello_world():
return 'Hello, world!'
if __name__ == '__main__':
app.run()
このスクリプトでは、最初にフラスコモジュールからフラスコクラスをインポートします。次に、__Name__変数を使用してアプリケーションの名前を指定するフラスコクラスの新しいインスタンスを作成します。
```
</div></details>
Using the following code, you can convert the data into Instruction and Output format (the prompter's instruction and the assistant's answer).
If you use this dataset for fine-tuning, please convert it with this code.
Conversion code reference:
https://github.com/h2oai/h2o-llmstudio/blob/5ebfd3879e226b4e1afd0a0b45eb632e60412129/app_utils/utils.py#L1888
```shell
pip install datasets
```
```python
from datasets import load_dataset
import pandas as pd
import os
import json
# Load the original oasst1 data
ds = load_dataset("OpenAssistant/oasst1")
train = ds["train"].to_pandas()
val = ds["validation"].to_pandas()
df_origin = pd.concat([train, val], axis=0).reset_index(drop=True)
# Load the Japanese-translated oasst1 data
df_ja = pd.read_json("oasst1_ja_89k.json")
# Merge the original oasst1 data with the Japanese-translated data
df = pd.merge(df_origin, df_ja[["message_id", "text_ja"]], on="message_id", how="left").copy()
df["text"] = df["text_ja"]
df_assistant = df[(df.role == "assistant")].copy()
df_prompter = df[(df.role == "prompter")].copy()
df_prompter = df_prompter.set_index("message_id")
df_assistant["output"] = df_assistant["text"].values
inputs = []
parent_ids = []
for _, row in df_assistant.iterrows():
input = df_prompter.loc[row.parent_id]
inputs.append(input.text)
parent_ids.append(input.parent_id)
df_assistant["instruction"] = inputs
df_assistant["parent_id"] = parent_ids
df_assistant = df_assistant[
["instruction", "output", "message_id", "parent_id", "lang", "rank"]
].rename(columns={"message_id": "id"})
# Exclude translation tasks, since that data has anomalies
df_assistant2 = df_assistant[~df_assistant["instruction"].str.contains("翻訳")]
# Everything below writes the result out to a JSON file ---------------
learn_datas = []
input_list = []
for n in range(len(df_assistant2)):
learn_data = {
"instruction": str(df_assistant2.iloc[n, 0]),
"input": "",
"output": ""
}
input_list.append(df_assistant2.iloc[n, 0])
learn_data["input"] = ""
learn_data["output"] = str(df_assistant2.iloc[n, 1])
learn_datas.append(learn_data)
json_learn_data = json.dumps(learn_datas, indent=4, ensure_ascii=False)
with open('oasst1_ja_converted.json', 'w', encoding="utf-8") as f:
f.write(json_learn_data)
```
oasst1-ja-89k Repository
https://github.com/kunishou/oasst1-89k-ja
OpenAssistant/oasst1
https://huggingface.co/datasets/OpenAssistant/oasst1 |
lucadiliello/naturalquestionsshortqa | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
splits:
- name: train
num_bytes: 100706304
num_examples: 104071
- name: validation
num_bytes: 12941478
num_examples: 12836
download_size: 61870589
dataset_size: 113647782
---
# Dataset Card for "naturalquestionsshortqa"
Split taken from the MRQA 2019 Shared Task, formatted and filtered for Question Answering. For the original dataset, have a look [here](https://huggingface.co/datasets/mrqa). |
branles14/ultrachat-uncensored_full | ---
license: cc-by-nc-4.0
---
# Ultrachat-Uncensored
Ultrachat-Uncensored is a variant of the original Ultrachat dataset available at [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), in which any examples whose bot messages match the specified terms are removed. These terms can be found in [filters.txt](https://huggingface.co/datasets/branles14/ultrachat-uncensored/blob/main/filters.txt).
This process was carried out in an attempt to neutralize the bot's responses by excluding particular terms. The goal is to foster more constructive and neutral conversations with the bot.
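A minimal sketch of this kind of term-based filtering is shown below. The filter terms, the field layout, and the assumption that turns alternate human/bot are all illustrative; the real list lives in filters.txt.

```python
# Illustrative terms only; the actual list is in filters.txt.
FILTER_TERMS = ["as an ai language model", "i cannot"]

def keep_example(conversation, filter_bot_only=True):
    """Drop an example if any (bot) turn contains a filtered term.
    Assumes turns alternate human/bot, starting with human."""
    for i, turn in enumerate(conversation):
        is_bot = i % 2 == 1
        if filter_bot_only and not is_bot:
            continue
        text = turn.lower()
        if any(term in text for term in FILTER_TERMS):
            return False
    return True

data = [
    ["Hi!", "Hello, how can I help?"],
    ["Tell me a joke.", "As an AI language model, I cannot do that."],
]
cleaned = [c for c in data if keep_example(c)]
print(len(cleaned))  # 1
```

Passing `filter_bot_only=False` corresponds to the "Full" variant, where human turns are screened as well.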
## Dataset Variants
There are two versions of this dataset available:
1. [Ultrachat-Uncensored](https://huggingface.co/datasets/branles14/ultrachat-uncensored): In this version, the filter is only applied to the bot's messages.
2. [Ultrachat-Uncensored Full](https://huggingface.co/datasets/branles14/ultrachat-uncensored_full): In this version, the filter is applied to both human and bot messages for a more thorough filtering process.
## Purpose
The idea behind removing certain terms is to create a chatbot that feels more neutral in its interactions. The intended outcome is to ensure that the bot engages in unbiased and fair dialogue, maintaining a neutral stance on controversial topics. This neutrality is expected to make conversations with the bot more enjoyable and less prone to unnecessary confrontations or misunderstandings.
Please note that while we have made an effort to filter specific terms, we recommend using the dataset responsibly, acknowledging that no filtering process can be perfect.
## Contribution
Contributions to enhance this project are welcome! Feel free to open issues or submit pull requests for improving the filter or suggesting new enhancements.
Enjoy using Ultrachat-Uncensored, and we look forward to your constructive feedback and suggestions. |
316usman/thematic1cembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 1356242968
num_examples: 1880948
download_size: 438647696
dataset_size: 1356242968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lununlu/Applio-RVC-Fork | ---
license: apache-2.0
---
|
aqua_rat | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: aqua-rat
pretty_name: Algebra Question Answering with Rationales
dataset_info:
- config_name: raw
features:
- name: question
dtype: string
- name: options
sequence: string
- name: rationale
dtype: string
- name: correct
dtype: string
splits:
- name: train
num_bytes: 42333059
num_examples: 97467
- name: test
num_bytes: 116759
num_examples: 254
- name: validation
num_bytes: 118616
num_examples: 254
download_size: 25568676
dataset_size: 42568434
- config_name: tokenized
features:
- name: question
dtype: string
- name: options
sequence: string
- name: rationale
dtype: string
- name: correct
dtype: string
splits:
- name: train
num_bytes: 46493643
num_examples: 97467
- name: test
num_bytes: 126263
num_examples: 254
- name: validation
num_bytes: 128853
num_examples: 254
download_size: 26429873
dataset_size: 46748759
configs:
- config_name: raw
data_files:
- split: train
path: raw/train-*
- split: test
path: raw/test-*
- split: validation
path: raw/validation-*
default: true
- config_name: tokenized
data_files:
- split: train
path: tokenized/train-*
- split: test
path: tokenized/test-*
- split: validation
path: tokenized/validation-*
---
# Dataset Card for AQUA-RAT
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/deepmind/AQuA](https://github.com/deepmind/AQuA)
- **Repository:** [https://github.com/deepmind/AQuA](https://github.com/deepmind/AQuA)
- **Paper:** [https://arxiv.org/pdf/1705.04146.pdf](https://arxiv.org/pdf/1705.04146.pdf)
### Dataset Summary
A large-scale dataset consisting of approximately 100,000 algebraic word problems.
The solution to each question is explained step-by-step using natural language.
This data is used to train a program generation model that learns to generate the explanation,
while generating the program that solves the question.
### Supported Tasks and Leaderboards
### Languages
en
## Dataset Structure
### Data Instances
```
{
"question": "A grocery sells a bag of ice for $1.25, and makes 20% profit. If it sells 500 bags of ice, how much total profit does it make?",
"options": ["A)125", "B)150", "C)225", "D)250", "E)275"],
"rationale": "Profit per bag = 1.25 * 0.20 = 0.25\nTotal profit = 500 * 0.25 = 125\nAnswer is A.",
"correct": "A"
}
```
### Data Fields
- `question` : (str) A natural language definition of the problem to solve
- `options` : (list(str)) 5 possible options (A, B, C, D and E), among which one is correct
- `rationale` : (str) A natural language description of the solution to the problem
- `correct` : (str) The correct option
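Since the `options` strings bundle the letter and the value (e.g. `"A)125"`), resolving the `correct` letter to its option text takes a small parsing step. The helper below is a sketch of that lookup; the `)` delimiter format is taken from the data instance above.

```python
def resolve_answer(example: dict) -> str:
    """Look up the option text matching the `correct` letter.
    Options are formatted like 'A)125'."""
    for option in example["options"]:
        letter, _, value = option.partition(")")
        if letter == example["correct"]:
            return value
    raise ValueError("correct letter not found in options")

example = {
    "options": ["A)125", "B)150", "C)225", "D)250", "E)275"],
    "correct": "A",
}
print(resolve_answer(example))  # 125
```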
### Data Splits
| | Train | Valid | Test |
| ----- | ------ | ----- | ---- |
| Examples | 97467 | 254 | 254 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Copyright 2017 Google Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
### Citation Information
```
@article{ling2017program,
title={Program induction by rationale generation: Learning to solve and explain algebraic word problems},
author={Ling, Wang and Yogatama, Dani and Dyer, Chris and Blunsom, Phil},
journal={ACL},
year={2017}
}
```
### Contributions
Thanks to [@arkhalid](https://github.com/arkhalid) for adding this dataset. |
joey234/mmlu-us_foreign_policy-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 28060
num_examples: 100
download_size: 18869
dataset_size: 28060
---
# Dataset Card for "mmlu-us_foreign_policy-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ravithejads/ms_marco_hi_mr_te_ta_ur | ---
dataset_info:
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
- name: query_hi
dtype: string
- name: answers_hi
dtype: string
- name: passage_text_hi
sequence: string
- name: query_mr
dtype: string
- name: passage_text_mr
sequence: string
- name: answers_mr
sequence: string
- name: query_te
dtype: string
- name: passage_text_te
sequence: string
- name: answers_te
sequence: string
- name: query_ta
dtype: string
- name: passage_text_ta
sequence: string
- name: answers_ta
sequence: string
- name: query_ur
dtype: string
- name: passage_text_ur
sequence: string
- name: answers_ur
sequence: string
splits:
- name: test
num_bytes: 477860001
num_examples: 9650
download_size: 165465297
dataset_size: 477860001
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
tillschwoerer/tagesschau | ---
annotations_creators:
- found
language:
- de
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: tagesschau
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- newspapers
- germany
- '2022'
task_categories:
- text-classification
task_ids:
- topic-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': amerika
'1': asien
'2': finanzen
'3': innenpolitik
'4': sportschau
'5': unternehmen
'6': verbraucher
splits:
- name: train
num_bytes: 4400114
num_examples: 1200
- name: validation
num_bytes: 555716
num_examples: 150
- name: test
num_bytes: 555716
num_examples: 150
download_size: 3412287
dataset_size: 5511546
---
# Dataset Card for "tagesschau"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
importpandas/openwebdata-6M | ---
license: apache-2.0
---
|
arefm/third_experiment_data | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: texts
dtype: string
- name: noisy_audio_0
dtype: audio
- name: noisy_audio_10
dtype: audio
- name: noisy_audio_20
dtype: audio
- name: noisy_audio_30
dtype: audio
- name: noisy_audio_40
dtype: audio
splits:
- name: train
num_bytes: 275715715.0
num_examples: 200
download_size: 267505861
dataset_size: 275715715.0
---
# Dataset Card for "third_experiment_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JohnTeddy3/midjourney-v5-202304 | ---
license: apache-2.0
task_categories:
- text-to-image
- image-to-text
language:
- en
tags:
- midjourney
---
# midjourney-v5-202304-clean
## Brief Introduction
Reposted from wanng/midjourney-v5-202304-clean.
Unofficial, crawled from Midjourney v5 for April 2023, 1,701,420 pairs in total.
## Dataset Information
Original project address: https://huggingface.co/datasets/tarungupta83/MidJourney_v5_Prompt_dataset
Some cleaning was applied, producing two files:
- ori_prompts_df.parquet (1,255,812 pairs, Midjourney's four-image grids)

- upscaled_prompts_df.parquet (445,608 pairs, images generated with the Upscale command, which indicates these images were more popular)

|
omegalabsinc/omega-multimodal | ---
license: mit
---
# OMEGA Labs Bittensor Subnet: Multimodal Dataset for AGI Research
[](https://omegatron.ai)
## Introduction
The OMEGA Labs Bittensor Subnet Dataset is a groundbreaking resource for accelerating Artificial General Intelligence (AGI) research and development. This dataset, powered by the Bittensor decentralized network, aims to be the world's largest multimodal dataset, capturing the vast landscape of human knowledge and creation.
With over 1 million hours of footage and 30 million+ 2-minute video clips, the OMEGA Labs dataset will offer unparalleled scale and diversity, covering 50+ scenarios and 15,000+ action phrases. By leveraging state-of-the-art models to translate video components into a unified latent space, this dataset enables the development of powerful AGI models and has the potential to transform various industries.
## Key Features
- 🌍 **Constant Stream of Fresh Data**: The OMEGA dataset is constantly updated with new entries scraped by miners on Bittensor's decentralized AI network. We estimate that within a few weeks, we can get to 5M+ new videos added daily.
- 📈 **Rich Data**: In addition to scale, we are focused on scraping relevant, high-quality data. Using [ImageBind](https://imagebind.metademolab.com/demo) embeddings of the submitted videos and corresponding captions, miners are rewarded based on three factors:
- **Diversity**: The further away each new datapoint is from existing datapoints (judged by embedding cosine similarity), the higher the reward
- **Richness**: The more detailed the caption (judged by cosine similarity between video and submitted caption), the higher the reward
- **Relevance**: Miners are asked to scrape data pertaining to handpicked categories, pertinent for building video understanding and training world models.
- 🧠 **Latent Representations**: ImageBind embeddings for the video, audio, and caption are pre-computed
- 🤖 **Empowering Digital Agents**: Enables the development of intelligent agents that can navigate complex workflows and assist users across platforms.
- 📊 **Flexible Metadata**: Filter the dataset to find clips relevant to topics you would like to train on or filter by your desired cosine similarities
## Dataset Structure
The OMEGA Labs Bittensor Subnet Dataset consists of the following columns:
- `video_id`: Unique identifier for each video clip.
- `youtube_id`: The original YouTube video ID.
- `description`: Description of the video content.
- `views`: Number of views the original YouTube video has received.
- `start_time`: Start time of the video clip within the original video.
- `end_time`: End time of the video clip within the original video.
- `video_embed`: Latent representation of the video content.
- `audio_embed`: Latent representation of the audio content.
- `description_embed`: Latent representation of the video description.
- `description_relevance_score`: Relevance score of the video description to the content.
- `query_relevance_score`: Relevance score of the video to the search query.
- `query`: The search query used to retrieve the video.
- `submitted_at`: Timestamp of when the video was added to the dataset.
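Since both miner rewards and the metadata filters above hinge on embedding cosine similarity, here is a minimal sketch of that computation in pure Python. The embedding vectors and the 0.5 threshold are illustrative stand-ins (real ImageBind embeddings are 1024-dimensional, and the subnet's actual cutoffs are not specified here):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors:
    # dot(a, b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical low-dimensional embeddings standing in for the dataset's
# `video_embed` and `description_embed` columns.
video_embed = [0.1, 0.3, 0.5, 0.7]
description_embed = [0.2, 0.4, 0.4, 0.6]

score = cosine_similarity(video_embed, description_embed)

# Keep only clips whose description is sufficiently aligned with the video;
# the 0.5 threshold is illustrative, not the subnet's actual cutoff.
is_relevant = score > 0.5
```

The same function applies to the diversity reward: a new datapoint's score against existing datapoints is computed pairwise, and lower similarity (greater distance) earns a higher reward.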
## Applications
The OMEGA Labs Bittensor Subnet Dataset empowers researchers and developers to push the boundaries of AGI by providing a vast and diverse resource for training and testing multimodal models. Some potential applications include:
- **Unified Representation Learning**: Train powerful models that can learn unified representations across modalities.
- **Any-to-Any Models**: Develop models capable of translating between different modalities, such as generating videos from text descriptions or vice versa.
- **Digital Agents**: Create intelligent agents that can navigate complex workflows and assist users across platforms.
- **Immersive Gaming**: Build realistic gaming environments with rich physics and interactions.
- **Video Understanding**: Advance the state-of-the-art in video processing tasks such as transcription, motion analysis, object detection, and emotion recognition.
## Say hi!
If you're interested in getting in touch, reach out to us on [Twitter](https://twitter.com/omegalabsai)!
You can also visit our [Github](https://github.com/omegalabsinc/omegalabs-bittensor-subnet/tree/main) to learn more about how our scraping is done!
And if you'd like to learn more about Bittensor, join the [Discord](https://discord.gg/6yZpQ9KV)! |
SergeiZu/databricks-llama2-15k | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7003731
num_examples: 15011
download_size: 4295323
dataset_size: 7003731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B | ---
pretty_name: Evaluation run of FelixChao/WestSeverus-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/WestSeverus-10.7B](https://huggingface.co/FelixChao/WestSeverus-10.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T17:05:37.081553](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B/blob/main/results_2024-02-01T17-05-37.081553.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6541756570563236,\n\
\ \"acc_stderr\": 0.032145025906151356,\n \"acc_norm\": 0.6555840410982634,\n\
\ \"acc_norm_stderr\": 0.0327981150369556,\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140343,\n \"mc2\": 0.7230180462238235,\n\
\ \"mc2_stderr\": 0.01453935209482768\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850947,\n\
\ \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538805\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.68442541326429,\n \
\ \"acc_stderr\": 0.004637944965914612,\n \"acc_norm\": 0.874726150169289,\n\
\ \"acc_norm_stderr\": 0.003303526413123495\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443865,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443865\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206858,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206858\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323385,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323385\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.01642881191589886,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.01642881191589886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233278,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233278\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140343,\n \"mc2\": 0.7230180462238235,\n\
\ \"mc2_stderr\": 0.01453935209482768\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.621683093252464,\n \
\ \"acc_stderr\": 0.013358407831777112\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/WestSeverus-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-05-37.081553.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-05-37.081553.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- '**/details_harness|winogrande|5_2024-02-01T17-05-37.081553.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T17-05-37.081553.parquet'
- config_name: results
data_files:
- split: 2024_02_01T17_05_37.081553
path:
- results_2024-02-01T17-05-37.081553.parquet
- split: latest
path:
- results_2024-02-01T17-05-37.081553.parquet
---
# Dataset Card for Evaluation run of FelixChao/WestSeverus-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WestSeverus-10.7B](https://huggingface.co/FelixChao/WestSeverus-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B",
"harness_winogrande_5",
split="train")
```
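Each of the MMLU subtasks follows the same config-name pattern listed in the YAML above (`harness_hendrycksTest_<subject>_5`). A small helper (a sketch, not part of the official `datasets` API) can build these names if you want to iterate over subjects:

```python
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the per-subtask config name used in this repo,
    e.g. 'anatomy' -> 'harness_hendrycksTest_anatomy_5'."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"


# Example usage (requires network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B",
#     mmlu_config_name("anatomy"),
#     split="latest",
# )
```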
## Latest results
These are the [latest results from run 2024-02-01T17:05:37.081553](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-10.7B/blob/main/results_2024-02-01T17-05-37.081553.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6541756570563236,
"acc_stderr": 0.032145025906151356,
"acc_norm": 0.6555840410982634,
"acc_norm_stderr": 0.0327981150369556,
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140343,
"mc2": 0.7230180462238235,
"mc2_stderr": 0.01453935209482768
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.013532472099850947,
"acc_norm": 0.7218430034129693,
"acc_norm_stderr": 0.013094469919538805
},
"harness|hellaswag|10": {
"acc": 0.68442541326429,
"acc_stderr": 0.004637944965914612,
"acc_norm": 0.874726150169289,
"acc_norm_stderr": 0.003303526413123495
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443865,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443865
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206858,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206858
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323385,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323385
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.01642881191589886,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.01642881191589886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233278,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233278
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140343,
"mc2": 0.7230180462238235,
"mc2_stderr": 0.01453935209482768
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971868
},
"harness|gsm8k|5": {
"acc": 0.621683093252464,
"acc_stderr": 0.013358407831777112
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hackathon-pln-es/spanish-to-quechua | ---
language:
- es
- qu
task_categories:
- translation
task:
- translation
---
# Spanish to Quechua
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Team members](#team-members)
## Dataset Description
This dataset is a compilation of websites and other datasets listed in the [dataset creation section](#dataset-creation). It contains translations from Spanish (es) to Quechua of Ayacucho (qu).
## Dataset Structure
### Data Fields
- es: The sentence in Spanish.
- qu: The sentence in Quechua of Ayacucho.
### Data Splits
- train: used to train the model (102,747 sentences).
- validation: used to validate the model during training (12,844 sentences).
- test: used to evaluate the model once training is finished (12,843 sentences).
## Dataset Creation
### Source Data
This dataset has been generated from:
- "Mundo Quechua" by "Ivan Acuña" - [available here](https://mundoquechua.blogspot.com/2006/07/frases-comunes-en-quechua.html)
- "Kuyakuykim (Te quiero): Apps con las que podrías aprender quechua" by "El comercio" - [available here](https://elcomercio.pe/tecnologia/actualidad/traductor-frases-romanticas-quechua-noticia-467022-noticia/)
- "Piropos y frases de amor en quechua" by "Soy Quechua" - [available here](https://www.soyquechua.org/2019/12/palabras-en-quechua-de-amor.html)
- "Corazón en quechua" by "Soy Quechua" - [available here](https://www.soyquechua.org/2020/05/corazon-en-quechua.html)
- "Oraciones en Español traducidas a Quechua" by "Tatoeba" - [available here](https://tatoeba.org/es/sentences/search?from=spa&query=&to=que)
- "AmericasNLP 2021 Shared Task on Open Machine Translation" by "americasnlp2021" - [available here](https://github.com/AmericasNLP/americasnlp2021/tree/main/data/quechua-spanish/parallel_data/es-quy)
### Data cleaning
- The dataset was manually cleaned during compilation, as some words in one language corresponded to several words in the other.
## Considerations for Using the Data
This is a first version of the dataset; we expect to improve it over time, especially to neutralize the biblical themes.
## Team members
- [Sara Benel](https://huggingface.co/sbenel)
- [Jose Vílchez](https://huggingface.co/JCarlos) |
CyberHarem/newcastle_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of newcastle/ニューカッスル/纽卡斯尔 (Azur Lane)
This is the dataset of newcastle/ニューカッスル/纽卡斯尔 (Azur Lane), containing 80 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, brown_hair, maid_headdress, blue_eyes, large_breasts, medium_breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 80 | 99.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/newcastle_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 80 | 64.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/newcastle_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 124.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/newcastle_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 80 | 92.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/newcastle_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 166.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/newcastle_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/newcastle_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_dress, black_footwear, blue_ribbon, blush, full_body, hair_ribbon, holding_umbrella, looking_at_viewer, maid, shoes, sidelocks, solo, black_pantyhose, cannon, closed_umbrella, rigging, simple_background, smile, turret, white_background, closed_mouth, detached_sleeves, jewelry, juliet_sleeves, machinery, sleeveless_dress, standing_on_one_leg, white_apron, wrist_cuffs, bare_shoulders, black_hair, blue_gemstone, cleavage_cutout, copyright_name, frilled_dress, hair_ornament, hand_on_hip, hand_up, leg_up, puffy_short_sleeves, shiny, skirt_hold, two_side_up, very_long_hair |
| 1 | 9 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, maid, smile, solo, frills, holding_umbrella, blush, closed_umbrella, black_dress, closed_mouth, white_apron, juliet_sleeves, simple_background, white_background |
| 2 | 11 |  |  |  |  |  | 1boy, blush, hetero, 1girl, mosaic_censoring, pussy, solo_focus, maid, penis, sex, sweat, vaginal, looking_at_viewer, nipples, dress, black_panties, breasts_out, garter_straps, black_thighhighs, detached_sleeves, open_mouth, spread_legs, anus, ass, black_hair, blue_ribbon, pov, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_footwear | blue_ribbon | blush | full_body | hair_ribbon | holding_umbrella | looking_at_viewer | maid | shoes | sidelocks | solo | black_pantyhose | cannon | closed_umbrella | rigging | simple_background | smile | turret | white_background | closed_mouth | detached_sleeves | jewelry | juliet_sleeves | machinery | sleeveless_dress | standing_on_one_leg | white_apron | wrist_cuffs | bare_shoulders | black_hair | blue_gemstone | cleavage_cutout | copyright_name | frilled_dress | hair_ornament | hand_on_hip | hand_up | leg_up | puffy_short_sleeves | shiny | skirt_hold | two_side_up | very_long_hair | cleavage | frills | 1boy | hetero | mosaic_censoring | pussy | solo_focus | penis | sex | sweat | vaginal | nipples | dress | black_panties | breasts_out | garter_straps | black_thighhighs | open_mouth | spread_legs | anus | ass | pov |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------------|:--------------|:--------|:------------|:--------------|:-------------------|:--------------------|:-------|:--------|:------------|:-------|:------------------|:---------|:------------------|:----------|:--------------------|:--------|:---------|:-------------------|:---------------|:-------------------|:----------|:-----------------|:------------|:-------------------|:----------------------|:--------------|:--------------|:-----------------|:-------------|:----------------|:------------------|:-----------------|:----------------|:----------------|:--------------|:----------|:---------|:----------------------|:--------|:-------------|:--------------|:-----------------|:-----------|:---------|:-------|:---------|:-------------------|:--------|:-------------|:--------|:------|:--------|:----------|:----------|:--------|:----------------|:--------------|:----------------|:-------------------|:-------------|:--------------|:-------|:------|:------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | X | | | X | X | X | | | X | | | X | | X | X | | X | X | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | X | | | | X | X | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CATIE-AQ/orange_sum_fr_prompt_summarization | ---
language:
- fr
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- summarization
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- orange_sum
---
# orange_sum_fr_prompt_summarization
## Summary
**orange_sum_fr_prompt_summarization** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **683,228** rows that can be used for a summarization task.
The original data (without prompts) comes from the dataset [orange_sum](https://huggingface.co/datasets/orange_sum) by Eddine et al.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
28 prompts were created for this dataset. The logic applied consists in proposing prompts in the infinitive, in the informal *tu* imperative form (tutoiement) and in the formal *vous* imperative form (vouvoiement).
```
'Résumer le texte suivant : "'+document+'"',
'Résume le texte suivant : "'+document+'"',
'Résumez le texte suivant : "'+document+'"',
'Résumer le texte suivant en quelques mots : "'+document+'"',
'Résume le texte suivant en quelques mots : "'+document+'"',
'Résumez le texte suivant en quelques mots : "'+document+'"',
"Condenser le texte à l'essentiel :" +document,
"Condense le texte à l'essentiel :" +document,
"Condensez le texte à l'essentiel :" +document,
'"'+document+' Rédiger un résumé du texte ci-dessus :',
'"'+document+' Rédige un résumé du texte ci-dessus :',
'"'+document+' Rédigez un résumé du texte ci-dessus :',
'Premièrement, lire le texte ci-dessous. \n\n "'+document+'"\n\n Maintenant, rédiger un court résumé.',
'Premièrement, lis le texte ci-dessous. \n\n "'+document+'"\n\n Maintenant, rédige un court résumé.',
'Premièrement, lisez le texte ci-dessous. \n\n "'+document+'"\n\n Maintenant, rédigez un court résumé.',
'Article : "'+document+'"/n Résumé : ',
'"'+document+' Comment reformuler cela en quelques mots ?',
'"'+document+' Comment peux-tu reformuler cela en quelques mots ?',
'"'+document+' Comment pouvez-vous reformuler cela en quelques mots ?',
'Résumer ce document : "'+document+'" Résumé :',
'Résume ce document : "'+document+'" Résumé :',
'Résumez ce document : "'+document+'" Résumé :',
'"'+document+' Compte tenu du document ci-dessus, écrire une phrase pour le résumer :',
'"'+document+' Compte tenu du document ci-dessus, écris une phrase pour le résumer :',
'"'+document+' Compte tenu du document ci-dessus, écrivez une phrase pour le résumer :',
'"'+document+' Rédiger un résumé du texte ci-dessus : ',
'"'+document+' Rédige un résumé du texte ci-dessus : ',
'"'+document+' Rédigez un résumé du texte ci-dessus : '
```
### Features used in the prompts
In the prompt list above, `document` and `targets` have been constructed from:
```python
orange_sum = load_dataset('orange_sum','abstract')
document = orange_sum['train'][i]['text']
targets = orange_sum['train'][i]['summary']
```
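For illustration, the construction above boils down to plain string formatting. The sketch below is not the actual DFP generation script, and the document/summary pair is made up, but it shows how one original row expands into several prompted rows:

```python
# Illustrative sketch (not the actual DFP generation script): how one
# original (document, summary) pair expands into several prompted rows.
# The toy French sentences below are made up for illustration.
document = "Le satellite a été lancé avec succès ce matin depuis Kourou."
targets = "Lancement réussi du satellite."

# Two of the 28 templates listed above, applied by string concatenation.
templates = [
    'Résumer le texte suivant : "' + document + '"',
    'Résumez ce document : "' + document + '" Résumé :',
]

# Each template yields one (inputs, targets) row, xP3-style.
rows = [{"inputs": t, "targets": targets} for t in templates]

for row in rows:
    print(row["inputs"])
```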
## Splits
- `train` with 599,228 samples
- `valid` with 42,000 samples
- `test` with 42,000 samples
## How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/orange_sum_fr_prompt_summarization")
```
## Citation
### Original data
```bib
@article{eddine2020barthez,
  title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
  author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
  journal={arXiv preprint arXiv:2010.12321},
  year={2020}
}
```
### This Dataset
```bib
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title = { DFP (Revision 1d24c09) },
  year = 2023,
  url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
CC-BY-SA-4.0 |
anan-2024/twitter_dataset_1713104545 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 134020
num_examples: 332
download_size: 72715
dataset_size: 134020
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
transformersegmentation/CHILDES | ---
configs:
- config_name: "English"
default: True
data_files:
- split: train
path: Eng-NA/train.csv
- split: valid
path: Eng-NA/valid.csv
- config_name: "French"
data_files:
- split: train
path: French/train.csv
- split: valid
path: French/valid.csv
- config_name: "German"
data_files:
- split: train
path: German/train.csv
- split: valid
path: German/valid.csv
- config_name: "Spanish"
data_files:
- split: train
path: Spanish/train.csv
- split: valid
path: Spanish/valid.csv
- config_name: "Dutch"
data_files:
- split: train
path: Dutch/train.csv
- split: valid
path: Dutch/valid.csv
- config_name: "Mandarin"
data_files:
- split: train
path: Mandarin/train.csv
- split: valid
path: Mandarin/valid.csv
- config_name: "Japanese"
data_files:
- split: train
path: Japanese/train.csv
- split: valid
path: Japanese/valid.csv
- config_name: "Cantonese"
data_files:
- split: train
path: Cantonese/train.csv
- split: valid
path: Cantonese/valid.csv
- config_name: "Estonian"
data_files:
- split: train
path: Estonian/train.csv
- split: valid
path: Estonian/valid.csv
- config_name: "Croatian"
data_files:
- split: train
path: Croatian/train.csv
- split: valid
path: Croatian/valid.csv
- config_name: "Danish"
data_files:
- split: train
path: Danish/train.csv
- split: valid
path: Danish/valid.csv
- config_name: "Basque"
data_files:
- split: train
path: Basque/train.csv
- split: valid
path: Basque/valid.csv
- config_name: "Hungarian"
data_files:
- split: train
path: Hungarian/train.csv
- split: valid
path: Hungarian/valid.csv
- config_name: "Turkish"
data_files:
- split: train
path: Turkish/train.csv
- split: valid
path: Turkish/valid.csv
- config_name: "Farsi"
data_files:
- split: train
path: Farsi/train.csv
- split: valid
path: Farsi/valid.csv
- config_name: "Icelandic"
data_files:
- split: train
path: Icelandic/train.csv
- split: valid
path: Icelandic/valid.csv
- config_name: "Indonesian"
data_files:
- split: train
path: Indonesian/train.csv
- split: valid
path: Indonesian/valid.csv
language:
- en
- de
- fr
- es
- nl
- cmn
- ja
- yue
- et
- hr
- da
- eu
- hu
- tr
- fa
- is
- id
tags:
- language modeling
- cognitive modeling
pretty_name: Phonemized Child Directed Speech
size_categories:
- 100K<n<1M
---
# Phonemized Child Directed Speech Dataset
This dataset contains utterances downloaded from CHILDES which have been pre-processed and converted to phonemic transcriptions by this [processing script](https://github.com/codebyzeb/Corpus-Phonemicizers). Many of the columns from CHILDES have been preserved in case they may be useful for experiments (e.g. number of morphemes, part-of-speech tags, etc.). The key columns added by the processing script are as follows:
| Column | Description |
|:----|:-----|
| `is_child`| Whether the utterance was spoken by a child or not. Note that this is set to `False` for all utterances in this dataset, but the processing script has the ability to preserve child utterances.|
| `processed_gloss`| The pre-processed orthographic utterance. This includes lowercasing, fixing English spelling and adding punctuation marks. This is based on the [AOChildes](https://github.com/UIUCLearningLanguageLab/AOCHILDES) preprocessing.|
| `phonemized_utterance`| A phonemic transcription of the utterance, space-separated with word boundaries marked with the `WORD_BOUNDARY` token.|
| `language_code`| Language code used for producing the phonemic transcriptions. May not match the `language` column provided by CHILDES (e.g. Eng-NA and Eng-UK tend to be transcribed with eng-us and eng-gb). |
| `character_split_utterance`| A space separated transcription of the utterance, produced simply by splitting the processed gloss by character. This is intended to have a very similar format to `phonemized_utterance` for studies comparing phonetic to orthographic transcriptions. |
The last two columns are designed for training character-based (phoneme-based) language models using a simple tokenizer that splits around whitespace. The `processed_gloss` column is suitable for word-based (or subword-based) language models with standard tokenizers.
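As a minimal illustration of that whitespace tokenization (the utterance below is made up for the sketch, not a real row from this dataset):

```python
# Minimal sketch of the whitespace "tokenizer" described above, applied to a
# made-up phonemized utterance (not a real row from this dataset).
utterance = "ð ɪ s WORD_BOUNDARY ɪ z WORD_BOUNDARY ɪ t WORD_BOUNDARY"

# Tokenization is just a split around whitespace: each phoneme and each
# WORD_BOUNDARY marker becomes one token.
tokens = utterance.split()

# Word segmentation can be recovered by cutting at the boundary markers.
words, current = [], []
for token in tokens:
    if token == "WORD_BOUNDARY":
        if current:
            words.append(current)
        current = []
    else:
        current.append(token)

print(tokens)
print(words)
```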
Note that the data has been sorted by the `target_child_age` column, which stores child age in months. This can be used to limit the training data according to a maximum child age, if you wish.
Each subset of the data is split into a training split containing most of the utterances and an in-distribution validation split containing 10,000 utterances. The following languages are included (ordered by number of phonemes):
| Language | Description | Speakers | Utterances | Words | Phonemes
|:----|:-----|:-----|:----|:-----|:-----|
| English | Taken from 44 corpora in Eng-NA collection of CHILDES and phonemized using language code `en-us`. | 2,692 | 1,646,954 | 7,090,066 | 21,932,139
| German | Taken from 10 corpora in German collection of CHILDES and phonemized using language code `ge`. | 627 | 850,888 | 3,893,168 | 14,058,836
| Indonesian | Taken from 1 corpus in EastAsian/Indonesian collection of CHILDES and phonemized using language code `id`. | 389 | 534,469 | 1,587,526 | 6,367,721
| Mandarin | Taken from 15 corpora in Chinese/Mandarin collection of CHILDES and phonemized using a [pinyin to IPA convertor](https://github.com/stefantaubert/pinyin-to-ipa/tree/master). | 883 | 326,759 | 1,511,851 | 6,106,770
| French | Taken from 11 corpora in French collection of CHILDES and phonemized using language code `fr-fr`. | 722 | 432,133 | 1,995,063 | 5,510,523
| Spanish | Taken from 18 corpora in Spanish collection of CHILDES and phonemized using language code `es`. | 562 | 286,462 | 1,266,366 | 4,511,261
| Japanese | Taken from 9 corpora in Japanese collection of CHILDES and phonemized using segments with language `japanese`. | 320 | 412,079 | 1,113,194 | 4,346,638
| Dutch | Taken from 5 corpora in DutchAfricaans/Dutch collection of CHILDES and phonemized using language code `nl`. | 86 | 297,497 | 1,246,006 | 4,034,742
| Estonian | Taken from 9 corpora in Other/Estonian collection of CHILDES and phonemized using language code `et`. | 118 | 103,343 | 544,680 | 2,347,066
| Cantonese | Taken from 2 corpora in Chinese/Cantonese collection of CHILDES and phonemized by converting from jyutping to IPA using the [pingyam database](https://github.com/kfcd/pingyam/tree/master). | 80 | 136,727 | 591,314 | 2,118,731
| Croatian | Taken from 1 corpus in Slavic/Croatian collection of CHILDES and phonemized using language code `hr`. | 51 | 55,284 | 214,921 | 813,619
| Icelandic | Taken from 2 corpora in Scandinavian/Icelandic collection of CHILDES and phonemized using language code `is`. | 15 | 50,657 | 197,519 | 772,952
| Danish | Taken from 1 corpus in Scandinavian/Danish collection of CHILDES and phonemized using language code `da`. | 1 | 48,976 | 192,527 | 579,375
| Basque | Taken from 2 corpora in Other/Basque collection of CHILDES and phonemized using language code `eu`. | 150 | 36,614 | 135,866 | 565,633
| Hungarian | Taken from 3 corpora in Other/Hungarian collection of CHILDES and phonemized using language code `hu`. | 65 | 31,633 | 116,917 | 478,444
| Turkish | Taken from 2 corpora in Other/Turkish collection of CHILDES and phonemized using language code `tr`. | 35 | 14,487 | 43,823 | 230,737
| Farsi | Taken from 2 corpora in Other/Farsi collection of CHILDES and phonemized using language code `fa-latn`. | 23 | 13,467 | 28,080 | 116,081
|
OBF/books | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
dtype: string
- name: red_pajama_subset
dtype: string
splits:
- name: train
num_bytes: 104991304946
num_examples: 205744
download_size: 62495685361
dataset_size: 104991304946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SIA86/WaterFlowCountersRecognition | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: annotations
sequence:
- name: bbox
sequence: int64
length: 4
- name: area
dtype: int64
- name: segmentation
sequence:
sequence: int64
- name: name
dtype:
class_label:
names:
'0': value_a
'1': value_b
'2': serial
- name: rotated
dtype:
class_label:
names:
'0': '0'
'1': '90'
'2': '180'
'3': '270'
config_name: WFCR_full
splits:
- name: train
num_bytes: 937884
num_examples: 1644
- name: test
num_bytes: 239710
num_examples: 412
download_size: 46791554
dataset_size: 1177594
---
|
lleticiasilvaa/test-CNPJ-sample-EN | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: tables
sequence: string
- name: schema
dtype: string
- name: example_values
dtype: string
- name: schema_with_example_values
dtype: string
splits:
- name: train
num_bytes: 39391
num_examples: 15
download_size: 17087
dataset_size: 39391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/race-c | ---
task_categories:
- question-answering
- multiple-choice
language:
- en
---
RACE-C: additional data complementing RACE (middle and high school reading comprehension) with college-level questions.
https://github.com/mrcdata/race-c
```bib
@InProceedings{pmlr-v101-liang19a,
title={A New Multi-choice Reading Comprehension Dataset for Curriculum Learning},
author={Liang, Yichan and Li, Jianheng and Yin, Jian},
booktitle={Proceedings of The Eleventh Asian Conference on Machine Learning},
pages={742--757},
year={2019}
}
``` |
dtrejopizzo/gsm8k | ---
license: apache-2.0
---
|
distilled-from-one-sec-cv12/chunk_167 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1238243828
num_examples: 241279
download_size: 1266007035
dataset_size: 1238243828
---
# Dataset Card for "chunk_167"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
H4438/full-text-universities | ---
dataset_info:
features:
- name: id
dtype: int64
- name: date
dtype: string
- name: alias
dtype: string
- name: university
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 9075720
num_examples: 684
download_size: 3651077
dataset_size: 9075720
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
University data in plain-text form
# Note:
- This data contains tables
# Information that each row may store
Each row will most likely have the structure below.\
See for reference: [HUST](https://reviewedu.net/school/truong-dai-hoc-bach-khoa-ha-noi-hust) \
Although this is the common structure, not every row has complete information or the same keys
```json
{
    "thông tin chung": "contact information + website + introduction to the university",
    "Mục tiêu phát triển": "development goals + the university's commitments",
    "Lịch sử phát triển": "history of the university from its founding to the present",
    "Đội ngũ cán bộ": "",
    "Cơ sở vật chất": "",
    "Đối tượng và phạm vi tuyển sinh": "",
    "Thời gian xét tuyển": "may be a table",
    "Phương thức tuyển sinh": "may be a table",
    "Các ngành tuyển sinh": "may be a table",
    "Học phí": "may be a table",
    "Điểm chuẩn": "may be a table",
    "Những quyền lợi của sinh viên khi theo học tại Trường": "",
    "Cơ hội ra trường": "future career opportunities",
    "tổng kết": "summary table"
}
``` |
ormeshein/captcha | ---
license: mit
---
|
xzyu/leuven-images | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- cognitive science
size_categories:
- 100M<n<1B
--- |
hlillemark/flores200_devtest_mt5-3b-flores200-baseline | ---
dataset_info:
features:
- name: id
dtype: int32
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: prediction
dtype: string
- name: chrf_unreduced
dtype: string
splits:
- name: devtest
num_bytes: 371166281
num_examples: 500000
download_size: 253943015
dataset_size: 371166281
---
# Dataset Card for "flores200_devtest_mt5-3b-flores200-baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FelippeAlex6/LippeVoicesCharactersPeople | ---
license: cc-by-2.5
---
|
soddokayo/kmou-2016klp | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 3732093
num_examples: 2928
- name: dev
num_bytes: 459796
num_examples: 366
- name: test
num_bytes: 449770
num_examples: 366
download_size: 951800
dataset_size: 4641659
---
# Dataset Card for "kmou-2016klp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amphora/annual_reports_us.ko.ja | ---
configs:
- config_name: Annual Report US
data_files:
- split: train
path: Annual_Report_US.csv
- config_name: Annual Report Japan
data_files:
- split: train
path: Annual_Report_Japan.csv
- config_name: Annual Report Korea
data_files:
- split: train
path: Annual_Report_Korea.csv
license: mit
---
|
Davlan/aya_african_dataset | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-futin__guess-en_3-fcaae9-2012466613 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: futin/guess
dataset_config: en_3
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: futin/guess
* Config: en_3
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
jan-hq/wizardLM_evol_instruct_v2_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 302019360.3
num_examples: 128700
- name: test
num_bytes: 33557706.7
num_examples: 14300
download_size: 161132486
dataset_size: 335577067.0
---
# Dataset Card for "wizardLM_evol_instruct_v2_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tverous/SemEval-Sample | ---
dataset_info:
features:
- name: conv_uttr_id
dtype: string
- name: conversation
dtype: string
- name: sentence
dtype: string
- name: emotion
dtype: int64
- name: cause_utterance_ID
sequence: string
splits:
- name: train
num_bytes: 13354056
num_examples: 13619
download_size: 1080587
dataset_size: 13354056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SemEval-Sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-elementary_mathematics-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 63081
num_examples: 378
download_size: 38060
dataset_size: 63081
---
# Dataset Card for "mmlu-elementary_mathematics-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NEUDM/absa-quad | ---
task_categories:
- text-generation
language:
- en
size_categories:
- 1K<n<10K
---
> The datasets here belong to the ABSA (Aspect-Based Sentiment Analysis) domain. The basic task is to extract from a sentence: aspect terms, aspect categories (term categories), the sentiment polarity of each term in context, and the opinion words targeting that term. Different datasets extract different subsets of this information, as noted in the "instruction" key of each jsonl file. Here I have recast this as a generation task: the model must produce the extraction results in a fixed format.
#### Example of one record from the jsonl file extracted from the acos dataset:
```
{
"task_type": "generation",
"dataset": "acos",
"input": ["the computer has difficulty switching between tablet and computer ."],
"output": "[['computer', 'laptop usability', 'negative', 'difficulty']]",
"situation": "none",
"label": "",
"extra": "",
"instruction": "
Task: Extracting aspect terms and their corresponding aspect categories, sentiment polarities, and opinion words.
Input: A sentence
Output: A list of 4-tuples, where each tuple contains the extracted aspect term, its aspect category, sentiment polarity, and opinion words (if any). Supplement: \"Null\" means that there is no occurrence in the sentence.
Example:
Sentence: \"Also it's not a true SSD drive in there but eMMC, which makes a difference.\"
Output: [['SSD drive', 'hard_disc operation_performance', 'negative', 'NULL']]'
"
}
```
> `label` and `extra` are not set here. The `instruction` field uses the string template shown above and includes one example for one-shot prompting. Each of the ABSA-domain datasets (absa-quad, acos, arts, aste-data-v2, mams, semeval-2014, semeval-2015, semeval-2016, towe) has its own instruction template of the same form, with minor differences in wording; in some datasets, different records within the same dataset even have different instruction content.
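Since the `output` field stores the quadruples as a Python-literal string, downstream evaluation needs to parse it back into structured tuples. A minimal sketch (assuming every record's `output` follows the list-of-lists format shown above):

```python
import ast

def parse_quads(output_str):
    """Parse an "output" string into a list of
    (aspect term, aspect category, sentiment polarity, opinion words) tuples."""
    # ast.literal_eval safely evaluates the Python-literal list syntax
    # without executing arbitrary code.
    return [tuple(quad) for quad in ast.literal_eval(output_str)]

record_output = "[['computer', 'laptop usability', 'negative', 'difficulty']]"
print(parse_quads(record_output))
# → [('computer', 'laptop usability', 'negative', 'difficulty')]
```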
#### Original dataset
- Data [link](https://github.com/IsakZhang/ABSA-QUAD)
- Paper: [Aspect Sentiment Quad Prediction as Paraphrase Generation](https://aclanthology.org/2021.emnlp-main.726.pdf)
- Note: the original dataset consists of the Rest15 and Rest16 folders; in this conversion I merged the two and split the result into train, validation, and test.
#### Current SOTA
*Figures from this [paper](https://arxiv.org/abs/2305.09193)*
- Metric: F1 score
- SOTA model: E2H-large (F1 score **52.39** on Rest15, **61.86** on Rest16)
- Paper: [Easy-to-Hard Learning for Information Extraction](https://arxiv.org/pdf/2305.09193.pdf)
- Note: this paper is one of those citing the original ABSA-QUAD paper, found via [Google Scholar](https://scholar.google.com/scholar?hl=zh-CN&as_sdt=2005&sciodt=0,5&cites=13359676136585163616&scipsc=&q=&scisbd=1); after comparing several 2023 works, I selected the best metric and model.
|
ShenRuililin/MedicalQnA | ---
license: mit
---
|
senhorsapo/amyb | ---
license: openrail
---
|