| datasetId | card |
|---|---|
ybelkada/oasst1-tiny-subset | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train
num_bytes: 59104494.0
num_examples: 39663
- name: test
num_bytes: 6567166.0
num_examples: 4407
download_size: 38767143
dataset_size: 65671660.0
---
# Dataset Card for "oasst1-tiny-subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/neumarco_fa_train | ---
pretty_name: '`neumarco/fa/train`'
viewer: false
source_datasets: ['irds/neumarco_fa']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/fa/train`
The `neumarco/fa/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/fa/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels` (relevance assessments); count=532,761
- `docpairs`; count=269,919,004
- For `docs`, use [`irds/neumarco_fa`](https://huggingface.co/datasets/irds/neumarco_fa)
This dataset is used by: [`neumarco_fa_train_judged`](https://huggingface.co/datasets/irds/neumarco_fa_train_judged)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_fa_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/neumarco_fa_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/neumarco_fa_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
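As an illustration of how the `qrels` records above can be used, the sketch below groups relevance assessments by query for lookup during evaluation. The sample records are made up for the example, but follow the field layout shown above (`query_id`, `doc_id`, `relevance`, `iteration`):

```python
from collections import defaultdict

# Illustrative qrels records following the schema shown above;
# these are not real data from the dataset.
qrels_records = [
    {'query_id': 'q1', 'doc_id': 'd10', 'relevance': 1, 'iteration': '0'},
    {'query_id': 'q1', 'doc_id': 'd11', 'relevance': 0, 'iteration': '0'},
    {'query_id': 'q2', 'doc_id': 'd20', 'relevance': 1, 'iteration': '0'},
]

# Group assessments by query_id so relevance labels can be looked up
# as qrels_by_query[query_id][doc_id] during evaluation.
qrels_by_query = defaultdict(dict)
for rec in qrels_records:
    qrels_by_query[rec['query_id']][rec['doc_id']] = rec['relevance']

print(dict(qrels_by_query))
# {'q1': {'d10': 1, 'd11': 0}, 'q2': {'d20': 1}}
```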
|
Anushka1304/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B | ---
pretty_name: Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5140887884293746,\n\
\ \"acc_stderr\": 0.034831195333324204,\n \"acc_norm\": 0.5180581384469735,\n\
\ \"acc_norm_stderr\": 0.03481277047428223,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\
\ \"mc2_stderr\": 0.013428724763055466\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636588,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n\
\ \"acc_stderr\": 0.004868341056566223,\n \"acc_norm\": 0.8156741684923322,\n\
\ \"acc_norm_stderr\": 0.0038695723555438196\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\
\ \"acc_stderr\": 0.027941727346256304,\n \"acc_norm\": 0.5935483870967742,\n\
\ \"acc_norm_stderr\": 0.027941727346256304\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.034468977386593325,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.034468977386593325\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.02019268298542333,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.02019268298542333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197776,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197776\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.02677299065336182,\n\
\ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.02677299065336182\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380157,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181357,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181357\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\
\ \"mc2_stderr\": 0.013428724763055466\n }\n}\n```"
repo_url: https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet'
- config_name: results
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- results_2023-10-09T06-55-19.126017.parquet
- split: latest
path:
- results_2023-10-09T06-55-19.126017.parquet
---
# Dataset Card for Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B",
"harness_truthfulqa_mc_0",
split="train")
```
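Each run is stored under a split whose name is the run timestamp with `-` and `:` replaced by `_` (compare the `2023-10-09T06-55-19.126017` in the parquet filenames with the `2023_10_09T06_55_19.126017` split names in the configs above). A small helper sketching that mapping — the function name is illustrative and not part of the `datasets` API:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, assuming the convention
    visible in the configs above: '-' and ':' become '_', '.' is kept."""
    return ts.replace("-", "_").replace(":", "_")

# run_timestamp_to_split("2023-10-09T06:55:19.126017")
# -> "2023_10_09T06_55_19.126017"
```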
## Latest results
These are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5140887884293746,
"acc_stderr": 0.034831195333324204,
"acc_norm": 0.5180581384469735,
"acc_norm_stderr": 0.03481277047428223,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.37999611805412853,
"mc2_stderr": 0.013428724763055466
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636588,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182526
},
"harness|hellaswag|10": {
"acc": 0.6096395140410277,
"acc_stderr": 0.004868341056566223,
"acc_norm": 0.8156741684923322,
"acc_norm_stderr": 0.0038695723555438196
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714506,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714506
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.027941727346256304,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.027941727346256304
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.034468977386593325,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.034468977386593325
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.02019268298542333,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.02019268298542333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197776,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197776
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859924,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192703,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192703
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380157,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534353,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181357,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181357
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.37999611805412853,
"mc2_stderr": 0.013428724763055466
}
}
```
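The top-level `"all"` block summarizes the per-task metrics. As a rough sketch of that aggregation — assuming a simple unweighted mean over tasks, which may not match the leaderboard's exact weighting or grouping:

```python
def mean_metric(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of one metric over all per-task result blocks
    that report it (a sketch; the leaderboard's own aggregation may
    weight or group tasks differently)."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

# Illustrative values only, not taken from the results above:
sample = {
    "task_a": {"acc": 0.25},
    "task_b": {"acc": 0.75},
    "task_c": {"mc1": 0.3},  # lacks "acc", so it is skipped
}
# mean_metric(sample) -> 0.5
```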
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b | ---
pretty_name: Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T05:14:37.796518](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-10-25T05-14-37.796518.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25786493288590606,\n\
\ \"em_stderr\": 0.004479992336423503,\n \"f1\": 0.33612416107382703,\n\
\ \"f1_stderr\": 0.004456179772038806,\n \"acc\": 0.3893274364772528,\n\
\ \"acc_stderr\": 0.009141619357749198\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.25786493288590606,\n \"em_stderr\": 0.004479992336423503,\n\
\ \"f1\": 0.33612416107382703,\n \"f1_stderr\": 0.004456179772038806\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \
\ \"acc_stderr\": 0.0058298983559372\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T05_14_37.796518
path:
- '**/details_harness|drop|3_2023-10-25T05-14-37.796518.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T05-14-37.796518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T05_14_37.796518
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-14-37.796518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-14-37.796518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T05_14_37.796518
path:
- '**/details_harness|winogrande|5_2023-10-25T05-14-37.796518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T05-14-37.796518.parquet'
- config_name: results
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- results_2023-09-22T00-09-37.890921.parquet
- split: 2023_10_25T05_14_37.796518
path:
- results_2023-10-25T05-14-37.796518.parquet
- split: latest
path:
- results_2023-10-25T05-14-37.796518.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-25T05:14:37.796518](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-10-25T05-14-37.796518.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.25786493288590606,
"em_stderr": 0.004479992336423503,
"f1": 0.33612416107382703,
"f1_stderr": 0.004456179772038806,
"acc": 0.3893274364772528,
"acc_stderr": 0.009141619357749198
},
"harness|drop|3": {
"em": 0.25786493288590606,
"em_stderr": 0.004479992336423503,
"f1": 0.33612416107382703,
"f1_stderr": 0.004456179772038806
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.0058298983559372
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
}
}
```
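For a quick sanity check you can also work with the aggregated JSON above directly, without downloading any parquet files. The snippet below is an illustrative sketch (the `results` dict is copied verbatim from the block above; the `"all"` key is the cross-task aggregate) that extracts and ranks the per-task accuracies with their standard errors:

```python
# The aggregated results shown above, as a plain Python dict.
results = {
    "all": {"em": 0.25786493288590606, "em_stderr": 0.004479992336423503,
            "f1": 0.33612416107382703, "f1_stderr": 0.004456179772038806,
            "acc": 0.3893274364772528, "acc_stderr": 0.009141619357749198},
    "harness|drop|3": {"em": 0.25786493288590606, "em_stderr": 0.004479992336423503,
                       "f1": 0.33612416107382703, "f1_stderr": 0.004456179772038806},
    "harness|gsm8k|5": {"acc": 0.04700530705079606, "acc_stderr": 0.0058298983559372},
    "harness|winogrande|5": {"acc": 0.7316495659037096, "acc_stderr": 0.012453340359561195},
}

# Collect per-task accuracies, skipping the "all" aggregate and tasks
# (like DROP) that report em/f1 instead of acc.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# Print each task as "value ± stderr", best first.
for task, acc in sorted(per_task_acc.items(), key=lambda kv: kv[1], reverse=True):
    stderr = results[task]["acc_stderr"]
    print(f"{task}: {acc:.4f} ± {stderr:.4f}")
```

The same pattern works on any results JSON from this repo once it is parsed with `json.load`.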
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AIML-TUDA/v-lol-trains | ---
license: cc-by-4.0
task_categories:
- image-classification
language:
- en
tags:
- vlol
- v-lol
- visual logical learning
- reasoning
- visual reasoning
- logical reasoning
- ILP
- Symbolic AI
- logic
- Inductive logic programming
pretty_name: 'V-LoL: A Diagnostic Dataset for Visual Logical Learning'
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage** https://sites.google.com/view/v-lol/home
- **Repository** https://github.com/ml-research/vlol-dataset-gen
- **Paper** https://arxiv.org/abs/2306.07743
- **Point of Contact:** lukas_henrik.helff@tu-darmstadt.de
### Dataset Summary
This diagnostic dataset ([website](https://sites.google.com/view/v-lol), [paper](https://doi.org/10.48550/arXiv.2306.07743)) is specifically designed to evaluate the visual logical learning capabilities of machine learning models.
It offers a seamless integration of visual and logical challenges, providing 2D images of complex visual trains,
where the classification is derived from rule-based logic.
The fundamental idea of V-LoL is to integrate the explicit logical learning tasks of classic symbolic AI benchmarks into visually complex scenes,
creating a unique visual input that retains the challenges and versatility of explicit logic.
In doing so, V-LoL bridges the gap between symbolic AI challenges and contemporary deep learning datasets offering various visual logical learning tasks
that pose challenges for AI models across a wide spectrum of AI research, from symbolic to neural and neuro-symbolic AI.
Moreover, we provide a flexible dataset generator ([GitHub](https://github.com/ml-research/vlol-dataset-gen)) that
empowers researchers to easily exchange or modify the logical rules, thereby enabling the creation of new datasets incorporating novel logical learning challenges.
By combining visual input with logical reasoning, this dataset serves as a comprehensive benchmark for assessing the ability
of machine learning models to learn and apply logical reasoning within a visual context.
### Supported Tasks and Leaderboards
We offer a diverse set of datasets that present challenging AI tasks targeting various reasoning abilities.
The following provides an overview of the available V-LoL challenges and corresponding dataset splits.
| V-LoL Challenges | Train set | Validation set | # of train samples | # of validation samples |
| --- | --- | ----------- | --- | ----------- |
| V-LoL-Trains-TheoryX | V-LoL-Trains-TheoryX | V-LoL-Trains-TheoryX | 10000 | 2000 |
| V-LoL-Trains-Numerical | V-LoL-Trains-Numerical | V-LoL-Trains-Numerical | 10000 | 2000 |
| V-LoL-Trains-Complex | V-LoL-Trains-Complex | V-LoL-Trains-Complex | 10000 | 2000 |
| V-LoL-Blocks-TheoryX | V-LoL-Blocks-TheoryX | V-LoL-Blocks-TheoryX | 10000 | 2000 |
| V-LoL-Blocks-Numerical | V-LoL-Blocks-Numerical | V-LoL-Blocks-Numerical | 10000 | 2000 |
| V-LoL-Blocks-Complex | V-LoL-Blocks-Complex | V-LoL-Blocks-Complex | 10000 | 2000 |
| V-LoL-Trains-TheoryX-len7 | V-LoL-Trains-TheoryX | V-LoL-Trains-TheoryX-len7 | 12000 | 2000 |
| V-LoL-Trains-Numerical-len7 | V-LoL-Trains-Numerical | V-LoL-Trains-Numerical-len7 | 12000 | 2000 |
| V-LoL-Trains-Complex-len7 | V-LoL-Trains-Complex | V-LoL-Trains-Complex-len7 | 12000 | 2000 |
| V-LoL-Random-Trains-TheoryX | V-LoL-Trains-TheoryX | V-LoL-Random-Trains-TheoryX | 12000 | 12000 |
| V-LoL-Random-Blocks-TheoryX | V-LoL-Blocks-TheoryX | V-LoL-Random-Blocks-TheoryX | 12000 | 12000 |
The following gives more detailed explanations of the different V-LoL challenges:
Logical complexity:
- Theory X (marked 'TheoryX'): The train has either a short, closed car or a car with a barrel load is somewhere behind a car with a golden vase load. This rule was originally introduced as "Theory X" in the new East-West Challenge.
- Numerical rule (marked 'Numerical'): The train has a car where its car position equals its number of payloads which equals its number of wheel axles.
- Complex rule (marked 'Complex'): Either there is a car whose car number is smaller than its wheel axle count and smaller than its number of loads, or there is a short and a long car with the same colour where the position number of the short car is smaller than the wheel axle count of the long car, or the train has three differently coloured cars. We refer to Tab. 3 in the supplement for more insights on the reasoning properties required for each rule.
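To make the rule semantics concrete, here is a minimal Python sketch of Theory X over a symbolic train representation; the attribute names (`length`, `roof`, `loads`) are illustrative assumptions, not the generator's actual schema:

```python
def eastbound_theory_x(train):
    """Theory X: the train has a short, closed car, or a car with a
    barrel load somewhere behind a car with a golden vase load."""
    # Condition 1: a short car with a roof (i.e. a closed car).
    if any(car["length"] == "short" and car["roof"] != "none" for car in train):
        return True
    # Condition 2: a barrel load strictly behind a golden-vase load.
    vase_seen = False
    for car in train:  # cars are ordered front to back
        if vase_seen and "barrel" in car["loads"]:
            return True
        if "golden_vase" in car["loads"]:
            vase_seen = True
    return False

train = [
    {"length": "long", "roof": "none", "loads": ["golden_vase"]},
    {"length": "long", "roof": "none", "loads": ["barrel"]},
]
print(eastbound_theory_x(train))  # True: barrel behind golden vase
```

The numerical and complex rules can be expressed as similar predicates over the same symbolic car attributes.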
Visual complexity:
- Realistic train representations (marked 'Trains').
- Block representations (marked 'Blocks').
OOD Trains:
- A train carrying 2-4 cars. (default)
- A train carrying 7 cars. (marked 'len7')
Train attribute distributions:
- Michalski attribute distribution. (default)
- Random attribute distribution. (marked 'Random')
### Languages
English
## Dataset Structure
### Data Instances
```
{
'image': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=480x270 at 0x1351D0EE0>,
'label': 1
}
```
### Data Fields
The data instances have the following fields:
- image: A PIL.Image.Image object containing the image. Note that when accessing the image column (e.g. dataset[0]["image"]), the image file is automatically decoded.
Decoding a large number of image files can take a significant amount of time.
Thus it is important to query the sample index before the "image" column, i.e. dataset[0]["image"] should always be preferred over dataset["image"][0].
- label: an int classification label.
Class labels mapping:
| ID | Class |
| --- | ----------- |
| 0 | Westbound |
| 1 | Eastbound |
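As a sketch of the preferred access pattern together with the label mapping above (the rows below are an in-memory stand-in for illustration; real usage would index the object returned by `datasets.load_dataset`):

```python
CLASS_NAMES = {0: "Westbound", 1: "Eastbound"}

def describe(row):
    """Return a human-readable class name for one dataset row."""
    return CLASS_NAMES[row["label"]]

# Stand-in rows mimicking the card's schema ({'image': ..., 'label': ...}).
dataset = [{"image": None, "label": 1}, {"image": None, "label": 0}]

# Row-first indexing (dataset[0]["image"]) decodes only one image;
# column-first indexing (dataset["image"][0]) would decode the whole column.
first = dataset[0]
print(describe(first))  # Eastbound
```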
### Data Splits
See the split table under Supported Tasks and Leaderboards above.
## Dataset Creation
### Curation Rationale
Despite the successes of recent developments in visual AI, various shortcomings still exist:
from missing exact logical reasoning, to limited abstract generalization abilities, to difficulties understanding complex and noisy scenes.
Unfortunately, existing benchmarks were not designed to capture more than a few of these aspects.
Whereas deep learning datasets focus on visually complex data but simple visual reasoning tasks,
inductive logic datasets involve complex logical learning tasks but lack the visual component.
To address this, we propose the visual logical learning dataset, V-LoL, that seamlessly combines visual and logical challenges.
Notably, we introduce the first instantiation of V-LoL, V-LoL-Train: a visual rendition of a classic benchmark in symbolic AI, the Michalski train problem.
By incorporating intricate visual scenes and flexible logical reasoning tasks within a versatile framework,
V-LoL-Train provides a platform for investigating a wide range of visual logical learning challenges.
To create new V-LoL challenges, we provide a comprehensive guide and resources in our [GitHub repository](https://github.com/ml-research/vlol-dataset-gen).
The repository offers a collection of tools and code that enable researchers and practitioners to easily generate new V-LoL challenges based on their specific requirements. By referring to our GitHub repository, users can access the necessary documentation, code samples, and instructions to create and customize their own V-LoL challenges.
### Source Data
#### Initial Data Collection and Normalization
The individual datasets are generated using the V-LoL-Train generator. See [GitHub repository](https://github.com/ml-research/vlol-dataset-gen).
#### Who are the source language producers?
See [GitHub repository](https://github.com/ml-research/vlol-dataset-gen).
### Annotations
#### Annotation process
The images are generated in two steps: first sampling a valid symbolic representation of a train and then visualizing it within a 3D scene.
#### Who are the annotators?
Annotations are automatically derived using a Python, Prolog, and Blender pipeline. See [GitHub repository](https://github.com/ml-research/vlol-dataset-gen).
### Personal and Sensitive Information
The dataset does not contain personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset has no social impact.
### Discussion of Biases
Please refer to our paper.
### Other Known Limitations
Please refer to our paper.
## Additional Information
### Dataset Curators
Lukas Helff
### Licensing Information
MIT License
### Citation Information
@misc{helff2023vlol,
title={V-LoL: A Diagnostic Dataset for Visual Logical Learning},
author={Lukas Helff and Wolfgang Stammer and Hikaru Shindo and Devendra Singh Dhami and Kristian Kersting},
journal={Dataset available from https://sites.google.com/view/v-lol},
year={2023},
eprint={2306.07743},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
### Contributions
Lukas Helff, Wolfgang Stammer, Hikaru Shindo, Devendra Singh Dhami, Kristian Kersting |
GEM/xsum | ---
annotations_creators:
- none
language_creators:
- unknown
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- summarization
task_ids: []
pretty_name: xsum
---
# Dataset Card for GEM/xsum
## Dataset Description
- **Homepage:** n/a
- **Repository:** https://github.com/EdinburghNLP/XSum
- **Paper:** https://www.aclweb.org/anthology/D18-1206
- **Leaderboard:** N/A
- **Point of Contact:** Shashi Narayan
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/xsum).
### Dataset Summary
XSum is an English news summarization dataset where the task is to predict the first sentence of an article from the rest of it.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/xsum')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/xsum).
#### website
n/a
#### paper
[ACL Anthology](https://www.aclweb.org/anthology/D18-1206)
#### authors
Shashi Narayan, Shay B. Cohen, Mirella Lapata (all affiliated with University of Edinburgh at the time of dataset creation)
## Dataset Overview
### Where to find the Data and its Documentation
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Github](https://github.com/EdinburghNLP/XSum)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[ACL Anthology](https://www.aclweb.org/anthology/D18-1206)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@InProceedings{xsum-emnlp,
author = "Shashi Narayan and Shay B. Cohen and Mirella Lapata",
title = "Don't Give Me the Details, Just the Summary! {T}opic-Aware Convolutional Neural Networks for Extreme Summarization",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing ",
year = "2018",
address = "Brussels, Belgium",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Shashi Narayan
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
shashinarayan@google.com
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Dialects
<!-- info: What dialects are covered? Are there multiple dialects per language? -->
<!-- scope: periscope -->
Since the source of the dataset are BBC articles, the language is in British English of the variation written by journalists.
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### Whose Language?
<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
Professional journalists
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
The dataset is for the task of abstractive summarization in its extreme form: summarizing a document in a single sentence. The idea is to create a short, one-sentence news summary answering the question "What is the article about?".
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Summarization
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Given a news article, produce a single sentence summary of the content of the article.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
University of Edinburgh
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Shashi Narayan, Shay B. Cohen, Mirella Lapata (all affiliated with University of Edinburgh at the time of dataset creation)
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
European Research Council (Lapata; award number 681760), the European Union under the Horizon 2020 SUMMA project (Narayan, Cohen; grant agreement 688139), and Huawei Technologies (Cohen).
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
The original data card was written by Laura Perez-Beltrachini and the data loader by Yacine Jernite. Sebastian Gehrmann migrated the data card to the new format and extended it. The v2 data loader was migrated by Abinaya Mahendiran.
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
- `Document`: Input news article.
- `Summary`: One sentence summary of the article.
- `Id`: BBC ID of the article.
#### Reason for Structure
<!-- info: How was the dataset structure determined? -->
<!-- scope: microscope -->
The Document/Summary format is standard for summarization datasets.
#### How were labels chosen?
<!-- info: How were the labels chosen? -->
<!-- scope: microscope -->
The labels are the first sentence of the source article.
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{
'document': 'The researchers have sequenced the genome of a strain of bacterium that causes the virulent infection.\nA survey in 2007 showed that bleeding canker had spread rapidly, with almost half of the two million horse chestnuts displaying symptoms of the disease.\nThe findings have been published in the journal PLoS One.\nA visible symptom of the disease is a lesion on the bark, which oozes a resin on to the trunk or sometimes the branches.\nThe bark underneath the canker is killed, and if cankers manage to go all the way around the trunk then the horse chestnut (Aesculus hippocastanum) will die because it cuts off the food supply. [...]',
'target': "A team of UK scientists hopes to shed light on the mysteries of bleeding canker, a disease that is threatening the nation's horse chestnut trees.",
}
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
| Section | Number of Documents |
| ------------- |:-------------:|
| Training | 204,045 |
| Validation | 11,332 |
| Testing | 11,334 |
| Total | 226,711 |
| Section | number of words| number of sentences |
| ------------- |:-------------:| :-------------:|
| Documents | 431.07 | 19.77 |
| Summary | 23.26 | 1.00 |
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
The identifiers in the URLs were used to randomly split the dataset into training (90%, 204,045), validation (5%, 11,332), and test (5%, 11,334) sets.
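One way to realize such an ID-based split deterministically is to hash each article identifier into a bucket; the sketch below illustrates the idea with rough 90/5/5 proportions and is an assumption, not the authors' documented procedure:

```python
import hashlib

def assign_split(article_id: str) -> str:
    """Deterministically bucket an article ID into train/validation/test
    with roughly 90/5/5 proportions."""
    digest = hashlib.md5(article_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100
    if bucket < 90:
        return "train"
    if bucket < 95:
        return "validation"
    return "test"

# The same ID always lands in the same split, so the split is reproducible.
print(assign_split("34227252"))
```

Hash-based assignment avoids any dependence on file ordering and keeps the split stable as new articles are added.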
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
Comparable datasets are often very extractive, which is not a strategy that works for one-sentence summaries. The dataset curators thus created this dataset as a way to evaluate truly abstractive models.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
Same as the communicative goal in GEM: a model should summarize a news article in a single sentence.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Found`
#### Where was it found?
<!-- info: If found, where from? -->
<!-- scope: telescope -->
`Single website`
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
The data was collected from articles published between 2010 and 2017. No other information is available about the language producers.
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
The collected articles included the following topics: News, Politics, Sports, Weather, Business, Technology, Science, Health, Family, Education, Entertainment and Arts
The dataset curators also used LDA to gain insight into this question and found that the following were the top keywords associated with each topic:
- **T1**: charge, court, murder, police, arrest, guilty, sentence, boy, bail, space, crown, trial
- **T2**: church, abuse, bishop, child, catholic, gay, pope, school, christian, priest, cardinal
- **T3**: council, people, government, local, housing, home, house, property, city, plan, authority
- **T4**: clinton, party, trump, climate, poll, vote, plaid, election, debate, change, candidate, campaign
- **T5**: country, growth, report, business, export, fall, bank, security, economy, rise, global, inflation
- **T6**: hospital, patient, trust, nhs, people, care, health, service, staff, report, review, system, child
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
not validated
#### Data Preprocessing
<!-- info: How was the text data pre-processed? (Enter N/A if the text was not pre-processed) -->
<!-- scope: microscope -->
The text was extracted from the HTML of the webpage. No further processing was done.
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
not filtered
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
none
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
#### Justification for Using the Data
<!-- info: If not, what is the justification for reusing the data? -->
<!-- scope: microscope -->
The copyright license of the data allows reusing it for this purpose.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
yes/very likely
#### Categories of PII
<!-- info: What categories of PII are present or suspected in the data? -->
<!-- scope: periscope -->
`generic PII`
#### Any PII Identification?
<!-- info: Did the curators use any automatic/manual method to identify PII in the dataset? -->
<!-- scope: periscope -->
no identification
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset ore related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for exemple because their language, language variety, or social or geographical context is underepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
unsure
#### Are the Language Producers Representative of the Language?
<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
The language and content of the data are focused on news and language in the UK and as such are not representative of speakers world-wide. Existing selection biases of the BBC carry over into this dataset.
|
ylacombe/tiny-humming | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: description
dtype: string
splits:
- name: train
num_bytes: 24599934.0
num_examples: 11
download_size: 23344171
dataset_size: 24599934.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlekseyKorshuk/product-photography-tiny-eval | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: image_mask
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3025814.0
num_examples: 10
download_size: 3006448
dataset_size: 3025814.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview | ---
pretty_name: Evaluation run of danielhanchen/open_llama_3b_600bt_preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [danielhanchen/open_llama_3b_600bt_preview](https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:47:34.979572](https://huggingface.co/datasets/open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview/blob/main/results_2023-09-22T13-47-34.979572.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413175,\n \"f1\": 0.04996329697986588,\n\
\ \"f1_stderr\": 0.0012567293128089149,\n \"acc\": 0.32150142444857593,\n\
\ \"acc_stderr\": 0.007826931083969837\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413175,\n\
\ \"f1\": 0.04996329697986588,\n \"f1_stderr\": 0.0012567293128089149\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.002138670301460455\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6369376479873717,\n \"acc_stderr\": 0.01351519186647922\n\
\ }\n}\n```"
repo_url: https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|drop|3_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-47-34.979572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-47-34.979572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|winogrande|5_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-47-34.979572.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- results_2023-07-19T15:00:20.394414.parquet
- split: 2023_09_22T13_47_34.979572
path:
- results_2023-09-22T13-47-34.979572.parquet
- split: latest
path:
- results_2023-09-22T13-47-34.979572.parquet
---
# Dataset Card for Evaluation run of danielhanchen/open_llama_3b_600bt_preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [danielhanchen/open_llama_3b_600bt_preview](https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview",
"harness_winogrande_5",
split="train")
```
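As noted above, each run's split is named after the run's timestamp. A minimal sketch of that naming convention (the helper name `to_split_name` is illustrative, not part of the `datasets` API): split names are the ISO timestamp with `-` and `:` replaced by `_`.

```python
def to_split_name(run_timestamp: str) -> str:
    """Convert a run timestamp into the corresponding split name.

    E.g. "2023-07-19T15:00:20.394414" -> "2023_07_19T15_00_20.394414",
    matching the split names listed in the YAML header above.
    """
    return run_timestamp.replace("-", "_").replace(":", "_")


# The two runs recorded in this dataset:
print(to_split_name("2023-07-19T15:00:20.394414"))  # 2023_07_19T15_00_20.394414
print(to_split_name("2023-09-22T13:47:34.979572"))  # 2023_09_22T13_47_34.979572
```

Passing one of these names as `split=` (instead of `"latest"`) selects the results of that specific run.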
## Latest results
These are the [latest results from run 2023-09-22T13:47:34.979572](https://huggingface.co/datasets/open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview/blob/main/results_2023-09-22T13-47-34.979572.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413175,
"f1": 0.04996329697986588,
"f1_stderr": 0.0012567293128089149,
"acc": 0.32150142444857593,
"acc_stderr": 0.007826931083969837
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413175,
"f1": 0.04996329697986588,
"f1_stderr": 0.0012567293128089149
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.002138670301460455
},
"harness|winogrande|5": {
"acc": 0.6369376479873717,
"acc_stderr": 0.01351519186647922
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Multimodal-Fatima/DTD_parition1_test_facebook_opt_6.7b_Attributes_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 91665552.0
num_examples: 1880
- name: fewshot_1_bs_16
num_bytes: 92070944.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 92895854.0
num_examples: 1880
- name: fewshot_5_bs_16
num_bytes: 93723701.0
num_examples: 1880
- name: fewshot_8_bs_16
num_bytes: 94963856.0
num_examples: 1880
download_size: 451991650
dataset_size: 465319907.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_6.7b_Attributes_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hhhwmws/zhouzhiruo | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
Zhou Zhiruo data supporting ChatHaruhi2, which can be invoked as follows:
```python
from chatharuhi import ChatHaruhi
chatbot = ChatHaruhi( role_from_hf = 'hhhwmws/zhouzhiruo', \
llm = 'openai')
response = chatbot.chat(role='张无忌', text = '周芷若!')
print(response)
```
Uploader: Weishi Mi
For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)
You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)
### Citation
Please cite this repo if you use its data or code.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
one-sec-cv12/chunk_38 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 25316716032.0
num_examples: 263584
download_size: 21757615757
dataset_size: 25316716032.0
---
# Dataset Card for "chunk_38"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
napatswift/thbud-doc | ---
dataset_info:
features:
- name: words
sequence: string
- name: norm_bboxes
sequence:
sequence: float64
- name: ner_tags
sequence: 'null'
- name: class
dtype:
class_label:
names:
0: toc
1: entry
2: other
- name: image
dtype: image
splits:
- name: train
num_bytes: 166520938.02956522
num_examples: 862
- name: test
num_bytes: 57215447.970434785
num_examples: 288
download_size: 209131993
dataset_size: 223736386.0
---
# Dataset Card for "thbud-doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
theojiang/contrastive_conditional_vid_diff_std_1_18_MSRVTT-train | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 6930335.0
num_examples: 200
download_size: 6797378
dataset_size: 6930335.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/data-standardized_cluster_12_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6875828
num_examples: 6416
download_size: 2875603
dataset_size: 6875828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_12_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewatef/MyPro | ---
license: apache-2.0
task_categories:
- text-generation
tags:
- legal
size_categories:
- n<1K
--- |
DatasetingBR/LucasGuarda | ---
license: openrail
---
|
plaguss/end2end_textclassification | ---
dataset_info:
features:
- name: text
dtype: string
id: field
- name: label
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: label-suggestion
dtype: string
id: suggestion
- name: label-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 343408
num_examples: 1000
download_size: 181964
dataset_size: 343408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "end2end_textclassification"
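Each record's `label` field is a list of per-annotator responses (`user_id`, `value`, `status`). A minimal sketch, purely illustrative and not part of the export format, of resolving a record's final label by majority vote over submitted responses:

```python
from collections import Counter

def majority_label(responses):
    """Return the most common 'value' among responses with status 'submitted'."""
    votes = [r["value"] for r in responses if r["status"] == "submitted"]
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

# Hypothetical responses shaped like the 'label' feature above.
responses = [
    {"user_id": "u1", "value": "positive", "status": "submitted"},
    {"user_id": "u2", "value": "positive", "status": "submitted"},
    {"user_id": "u3", "value": "negative", "status": "discarded"},
]
print(majority_label(responses))  # positive
```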
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hk-kaden-kim/uzh-hs23-etsp-eval-single-noaxislabel-line | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: test
num_bytes: 3500006.0
num_examples: 100
download_size: 3486375
dataset_size: 3500006.0
---
# Dataset Card for "uzh-hs23-etsp-eval-single-noaxislabel-line"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_psychology-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 228241
num_examples: 612
download_size: 132982
dataset_size: 228241
---
# Dataset Card for "mmlu-professional_psychology-rule-neg"
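The `answer` feature is a class label whose names are the letters A–D, so the stored value is an integer index. A minimal sketch (the helper name is an assumption) of decoding it back to a letter:

```python
ANSWER_NAMES = ["A", "B", "C", "D"]  # mirrors the class_label names in the schema

def decode_answer(idx):
    """Map the stored integer answer back to its letter choice."""
    return ANSWER_NAMES[idx]

print(decode_answer(3))  # D
```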
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2 | ---
pretty_name: Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T17:52:31.585367](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2/blob/main/results_2024-02-11T17-52-31.585367.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6562860847489057,\n\
\ \"acc_stderr\": 0.032004990310050704,\n \"acc_norm\": 0.654707691151165,\n\
\ \"acc_norm_stderr\": 0.03269649322595469,\n \"mc1\": 0.6193390452876377,\n\
\ \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7452565487832791,\n\
\ \"mc2_stderr\": 0.014341967286352852\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n\
\ \"acc_norm\": 0.7440273037542662,\n \"acc_norm_stderr\": 0.012753013241244521\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7267476598287194,\n\
\ \"acc_stderr\": 0.004447185883327435,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.0133878957315436,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.0133878957315436\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n\
\ \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n\
\ \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6193390452876377,\n\
\ \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7452565487832791,\n\
\ \"mc2_stderr\": 0.014341967286352852\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8839779005524862,\n \"acc_stderr\": 0.009000656983537947\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \
\ \"acc_stderr\": 0.012425078188395982\n }\n}\n```"
repo_url: https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|arc:challenge|25_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|gsm8k|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hellaswag|10_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T17-52-31.585367.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- '**/details_harness|winogrande|5_2024-02-11T17-52-31.585367.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T17-52-31.585367.parquet'
- config_name: results
data_files:
- split: 2024_02_11T17_52_31.585367
path:
- results_2024-02-11T17-52-31.585367.parquet
- split: latest
path:
- results_2024-02-11T17-52-31.585367.parquet
---
# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-11T17:52:31.585367](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2/blob/main/results_2024-02-11T17-52-31.585367.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6562860847489057,
"acc_stderr": 0.032004990310050704,
"acc_norm": 0.654707691151165,
"acc_norm_stderr": 0.03269649322595469,
"mc1": 0.6193390452876377,
"mc1_stderr": 0.01699762787190791,
"mc2": 0.7452565487832791,
"mc2_stderr": 0.014341967286352852
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7440273037542662,
"acc_norm_stderr": 0.012753013241244521
},
"harness|hellaswag|10": {
"acc": 0.7267476598287194,
"acc_stderr": 0.004447185883327435,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.0133878957315436,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.0133878957315436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6193390452876377,
"mc1_stderr": 0.01699762787190791,
"mc2": 0.7452565487832791,
"mc2_stderr": 0.014341967286352852
},
"harness|winogrande|5": {
"acc": 0.8839779005524862,
"acc_stderr": 0.009000656983537947
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.012425078188395982
}
}
```
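As an illustrative sketch (not the leaderboard's own aggregation code), the per-task accuracies above can be macro-averaged in the same spirit as the "all" entry; the small `results` dict here is a hand-copied subset of the JSON above, used only for demonstration:

```python
# Macro-average a few per-task MMLU accuracies copied from the results above.
# This mirrors how the "all" entry aggregates individual task scores.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")
```

With the full 57-task set from the JSON above, this kind of macro-average reproduces the MMLU portion of the aggregated score shown on the leaderboard.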
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/acerola_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of acerola/アセロラ (Pokémon)
This is the dataset of acerola/アセロラ (Pokémon), containing 500 images and their tags.
The core tags of this character are `purple_hair, hair_ornament, flipped_hair, short_hair, bangs, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 476.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 295.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1073 | 591.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 432.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1073 | 807.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/acerola_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, multicolored_dress, stitches, topknot, torn_dress, short_sleeves, armlet, open_mouth, hairclip, collarbone, tongue, medium_hair, :d, pokemon_(creature), eyelashes, looking_at_viewer, grey_dress, blush, :3, grey_eyes, hands_up |
| 1 | 5 |  |  |  |  |  | 1girl, :3, armlet, closed_mouth, flip-flops, grey_dress, hairclip, medium_hair, multicolored_dress, short_sleeves, smile, solo, stitches, toes, topknot, collarbone, full_body, grey_eyes, standing, torn_dress, looking_at_viewer, white_background, blue_dress, eyelashes, simple_background |
| 2 | 9 |  |  |  |  |  | 1girl, armlet, looking_at_viewer, short_sleeves, topknot, :3, closed_mouth, multicolored_dress, simple_background, solo, stitches, blush, smile, torn_dress, white_background, collarbone, grey_dress, hairclip |
| 3 | 9 |  |  |  |  |  | 1girl, :3, sandals, short_sleeves, smile, stitches, armlet, full_body, topknot, torn_dress, blush, collarbone, simple_background, standing, white_background, open_mouth, pokemon_(creature), solo |
| 4 | 11 |  |  |  |  |  | 1girl, blush, navel, looking_at_viewer, official_alternate_costume, open_mouth, hair_flower, topknot, eyelashes, necklace, tongue, floral_print, solo, :d, collarbone, medium_hair, bikini, day, bracelet, outdoors, pokemon_(creature), sarong |
| 5 | 35 |  |  |  |  |  | hood_up, official_alternate_costume, 1girl, eyelashes, tongue, open_mouth, single_glove, bead_bracelet, hooded_capelet, black_gloves, pantyhose, hands_up, blush, looking_at_viewer, medium_hair, pokemon_(creature), orange_shorts, striped_shorts, themed_object, cosplay, shoes, halloween, :d, vertical_stripes, solo |
| 6 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, blush, navel, nipples, sex, solo_focus, vaginal, nude, spread_legs, cum_in_pussy, bar_censor, collarbone, tongue, topknot, medium_breasts, on_back, veiny_penis |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | multicolored_dress | stitches | topknot | torn_dress | short_sleeves | armlet | open_mouth | hairclip | collarbone | tongue | medium_hair | :d | pokemon_(creature) | eyelashes | looking_at_viewer | grey_dress | blush | :3 | grey_eyes | hands_up | closed_mouth | flip-flops | smile | solo | toes | full_body | standing | white_background | blue_dress | simple_background | sandals | navel | official_alternate_costume | hair_flower | necklace | floral_print | bikini | day | bracelet | outdoors | sarong | hood_up | single_glove | bead_bracelet | hooded_capelet | black_gloves | pantyhose | orange_shorts | striped_shorts | themed_object | cosplay | shoes | halloween | vertical_stripes | 1boy | hetero | nipples | sex | solo_focus | vaginal | nude | spread_legs | cum_in_pussy | bar_censor | medium_breasts | on_back | veiny_penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-----------|:----------|:-------------|:----------------|:---------|:-------------|:-----------|:-------------|:---------|:--------------|:-----|:---------------------|:------------|:--------------------|:-------------|:--------|:-----|:------------|:-----------|:---------------|:-------------|:--------|:-------|:-------|:------------|:-----------|:-------------------|:-------------|:--------------------|:----------|:--------|:-----------------------------|:--------------|:-----------|:---------------|:---------|:------|:-----------|:-----------|:---------|:----------|:---------------|:----------------|:-----------------|:---------------|:------------|:----------------|:-----------------|:----------------|:----------|:--------|:------------|:-------------------|:-------|:---------|:----------|:------|:-------------|:----------|:-------|:--------------|:---------------|:-------------|:-----------------|:----------|:--------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | | | | | X | X | X | X | | | X | | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | X | X | X | X | X | | X | | | | X | | | | X | X | | | | | X | X | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | | X | | | | X | | X | X | X | X | X | X | X | | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 35 |  |  |  |  |  | X | | | | | | | X | | | X | X | X | X | X | X | | X | | | X | | | | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Wallace-fantini/llma2-fantini | ---
license: llama2
---
|
cakiki/cmake_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 14898478
num_examples: 559316
download_size: 7920865
dataset_size: 14898478
---
# Dataset Card for "cmake_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ccmusic-database/GZ_IsoTech | ---
license: mit
task_categories:
- audio-classification
language:
- zh
- en
tags:
- music
- art
pretty_name: GZ_IsoTech Dataset
size_categories:
- n<1K
viewer: false
---
# Dataset Card for GZ_IsoTech Dataset
The raw dataset comprises 2,824 audio clips showcasing various guzheng playing techniques. Specifically, 2,328 clips were sourced from virtual sound banks, while 496 clips were performed by a skilled professional guzheng artist. These recordings encompass a comprehensive range of tones inherent to the guzheng instrument.
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/ccmusic-database/Guzheng_Tech99>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://www.modelscope.cn/datasets/ccmusic/GZ_IsoTech>
- **Point of Contact:** <https://arxiv.org/abs/2209.08774>
### Dataset Summary
Due to the pre-existing split in the raw dataset, wherein the data has been partitioned approximately in a 4:1 ratio for training and testing sets, we uphold the original data division approach. In contrast to utilizing platform-specific automated splitting mechanisms, we directly employ the pre-split data for subsequent integration steps.
### Supported Tasks and Leaderboards
MIR, audio classification
### Languages
Chinese, English
## Usage
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/GZ_IsoTech")
for item in ds["train"]:
print(item)
for item in ds["test"]:
print(item)
```
## Maintenance
```bash
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/ccmusic-database/GZ_IsoTech
cd GZ_IsoTech
```
## Dataset Structure
| audio(.wav, 22050Hz) | mel(.jpg, 22050Hz) | label | cname |
| :----------------------------------------------------------------------------------------------------------------------: | :------------------------------------: | :-----: | :----: |
| <audio controls src="https://huggingface.co/datasets/ccmusic-database/GZ_IsoTech/resolve/main/data/record_chanyin1.wav"> | <img src="./data/record_chanyin1.jpg"> | 8-class | string |
| ... | ... | ... | ... |
### Data Instances
.zip(.flac, .csv)
### Data Fields
Categorization of the clips is based on the diverse playing techniques characteristic of the guzheng; the clips are divided into eight categories: Vibrato (chanyin), Upward Portamento (shanghuayin), Downward Portamento (xiahuayin), Returning Portamento (huihuayin), Glissando (guazou, huazhi), Tremolo (yaozhi), Harmonic (fanyin), and Plucks (gou, da, mo, tuo…).
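For classification work, the eight technique categories above are often handled through a label map. The sketch below is illustrative only — the integer ordering is our own assumption, so consult the dataset's `features` for the actual label-to-id mapping:

```python
# The eight guzheng technique classes described above. The integer ordering
# here is an assumption for illustration; check ds.features for the real map.
TECHNIQUES = [
    "vibrato",               # chanyin
    "upward_portamento",     # shanghuayin
    "downward_portamento",   # xiahuayin
    "returning_portamento",  # huihuayin
    "glissando",             # guazou, huazhi
    "tremolo",               # yaozhi
    "harmonic",              # fanyin
    "plucks",                # gou, da, mo, tuo, ...
]
id2label = dict(enumerate(TECHNIQUES))
label2id = {name: i for i, name in enumerate(TECHNIQUES)}
```

Maps like these are the shape expected by most audio-classification model configs (`id2label`/`label2id`).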
### Data Splits
train, test
## Dataset Creation
### Curation Rationale
The Guzheng is a kind of traditional Chinese instrument with diverse playing techniques. Instrument playing techniques (IPT) play an important role in musical performance. However, most of the existing works for IPT detection show low efficiency for variable-length audio and do not assure generalization as they rely on a single sound bank for training and testing. In this study, we propose an end-to-end Guzheng playing technique detection system using Fully Convolutional Networks that can be applied to variable-length audio. Because each Guzheng playing technique is applied to a note, a dedicated onset detector is trained to divide an audio into several notes and its predictions are fused with frame-wise IPT predictions. During fusion, we add the IPT predictions frame by frame inside each note and get the IPT with the highest probability within each note as the final output of that note. We create a new dataset named GZ_IsoTech from multiple sound banks and real-world recordings for Guzheng performance analysis. Our approach achieves 87.97% in frame-level accuracy and 80.76% in note-level F1 score, outperforming existing works by a large margin, which indicates the effectiveness of our proposed method in IPT detection.
### Source Data
#### Initial Data Collection and Normalization
Dichucheng Li, Monan Zhou
#### Who are the source language producers?
Students from FD-LAMT
### Annotations
#### Annotation process
This database contains 2824 audio clips of guzheng playing techniques. Among them, 2328 pieces were collected from virtual sound banks, and 496 pieces were played and recorded by a professional guzheng performer.
#### Who are the annotators?
Students from FD-LAMT
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
Promoting the development of the music AI industry
### Discussion of Biases
Only for Traditional Chinese Instruments
### Other Known Limitations
Insufficient sample
## Additional Information
### Dataset Curators
Dichucheng Li
### Evaluation
[Li, Dichucheng, Yulun Wu, Qinyu Li, Jiahao Zhao, Yi Yu, Fan Xia and Wei Li. “Playing Technique Detection by Fusing Note Onset Information in Guzheng Performance.” International Society for Music Information Retrieval Conference (2022).](https://archives.ismir.net/ismir2022/paper/000037.pdf)
### Licensing Information
```
MIT License
Copyright (c) FD-LAMT
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```bibtex
@dataset{zhaorui_liu_2021_5676893,
author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
month = {mar},
year = {2024},
publisher = {HuggingFace},
version = {1.2},
url = {https://huggingface.co/ccmusic-database}
}
```
### Contributions
Promoting the development of the music AI industry |
matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27 | ---
license:
- other
pretty_name: >-
python copilot image training using class knowledge graphs updated 2024-01-27
dataset_info:
- config_name: v1_transformers_examples_pytorch
splits:
- name: v1_transformers_examples_pytorch
- config_name: v2_pytorch_torch_distributed_fsdp
splits:
- name: v2_pytorch_torch_distributed_fsdp
- config_name: v3_deepspeed_deepspeed_runtime
splits:
- name: v3_deepspeed_deepspeed_runtime
- config_name: v4_fused_gelu_testing_src
splits:
- name: v4_fused_gelu_testing_src
- config_name: v5_unsloth_unsloth_models
splits:
- name: v5_unsloth_unsloth_models
- config_name: v6_blip_models
splits:
- name: v6_blip_models
- config_name: v7_text_generation_inference_server_text_generation_server
splits:
- name: v7_text_generation_inference_server_text_generation_server
- config_name: v8_spark_python_pyspark_pandas_plot
splits:
- name: v8_spark_python_pyspark_pandas_plot
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: v1_transformers_examples_pytorch
data_files:
- split: v1_transformers_examples_pytorch
path: train/train-0002-transformers-examples-pytorch.parquet
- config_name: v2_pytorch_torch_distributed_fsdp
data_files:
- split: v2_pytorch_torch_distributed_fsdp
path: train/train-0003-pytorch-torch-distributed-fsdp.parquet
- config_name: v3_deepspeed_deepspeed_runtime
data_files:
- split: v3_deepspeed_deepspeed_runtime
path: train/train-0004-deepspeed-deepspeed-runtime.parquet
- config_name: v4_fused_gelu_testing_src
data_files:
    - split: v4_fused_gelu_testing_src
path: train/train-0005-fused-gelu-testing-src.parquet
- config_name: v5_unsloth_unsloth_models
data_files:
- split: v5_unsloth_unsloth_models
path: train/train-0006-unsloth-unsloth-models.parquet
- config_name: v6_blip_models
data_files:
- split: v6_blip_models
path: train/train-0007-blip-models.parquet
- config_name: v7_text_generation_inference_server_text_generation_server
data_files:
- split: v7_text_generation_inference_server_text_generation_server
path: train/train-0008-text-generation-inference-server-text_generation_server.parquet
- config_name: v8_spark_python_pyspark_pandas_plot
data_files:
- split: v8_spark_python_pyspark_pandas_plot
path: train/train-0009-spark-python-pyspark-pandas-plot.parquet
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-image.class-v1_00003555.parquet
size_categories:
- 100K<n<1M
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- class
- classes
task_categories:
- text-to-image
- image-to-image
- question-answering
task_ids:
- parsing
---
## Python Copilot Image Training using Class Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains a png file in the **dbytes** column.
- Rows: 312836
- Size: 294.1 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the **dbytes** column:
```
{
"dbytes": "binary",
"dbytes_len": "int64",
"dbytes_mb": "float64",
"filename": "string",
"path": "string",
"repo": "string",
"type": "string"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27", data_dir="files")
```
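Since each row stores the raw PNG bytes in the `dbytes` column, a cheap sanity check before writing rows to disk is to peek at the PNG's IHDR header. This is a stdlib-only sketch (the helper name is our own, not part of the dataset tooling):

```python
import struct

def png_size(dbytes: bytes) -> tuple[int, int]:
    """Return (width, height) read from a PNG's IHDR chunk, without decoding pixels."""
    # A PNG stream starts with an 8-byte signature, then the IHDR chunk:
    # 4-byte length, b"IHDR", 4-byte width, 4-byte height (both big-endian).
    if dbytes[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("dbytes does not look like a PNG stream")
    width, height = struct.unpack(">II", dbytes[16:24])
    return width, height
```

A row's image can then be validated (`png_size(row["dbytes"])`) before saving, without pulling in an imaging library.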
|
greathero/evenmoreevenmoreevenmorenewercontrailsvalidationdataset | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 493022743.945
num_examples: 16695
download_size: 477064136
dataset_size: 493022743.945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
oegbo/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
harouzie/vietnews | ---
license: apache-2.0
dataset_info:
features:
- name: guid
dtype: int64
- name: title
dtype: string
- name: abstract
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 325418455
num_examples: 99134
- name: validation
num_bytes: 73397317
num_examples: 22184
- name: test
num_bytes: 74536959
num_examples: 22498
download_size: 246782373
dataset_size: 473352731
language:
- vi
pretty_name: vietnews
task_categories:
- summarization
tags:
- finance
- legal
size_categories:
- 100K<n<1M
--- |
ourjames/Linda-Chase-Head-20170720 | ---
license: apache-2.0
---
|
tr416/dataset_20231006_192401 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73925
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_192401"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/balance_scale | ---
language:
- en
tags:
- balance_scale
- tabular_classification
- multiclass_classification
- binary_classification
- UCI
pretty_name: Balance
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- balance
- is_balanced
---
# Balance scale
The [Balance scale dataset](https://archive-beta.ics.uci.edu/dataset/12/balance+scale) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Two weights are put on the arms of a scale. Where does the scale tilt?
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|---------------------------------------------------------------|
| balance | Multiclass classification | Where does the scale tilt? |
| is_balanced | Binary classification | Does the scale tilt? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/balance_scale", "balance")["train"]
```
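Rows loaded this way are plain dicts, and the target feature always sits in the last position, so it can be split off positionally. A minimal sketch — the column names in the usage note are illustrative, not the dataset's actual schema:

```python
def split_features_target(row: dict):
    """Split one loaded row into (features, target).

    Assumes the target is the last column, as stated in the card;
    dict insertion order is preserved in Python 3.7+.
    """
    *feature_keys, target_key = list(row)
    return {k: row[k] for k in feature_keys}, row[target_key]
```

For example, `split_features_target({"left_weight": 1, "left_distance": 5, "right_weight": 2, "right_distance": 3, "class": "L"})` returns the four weight/distance features and `"L"` as the target (hypothetical column names).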
# Features
Target feature changes according to the selected configuration and is always in last position in the dataset. |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_3_tp_0.7 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43666558
num_examples: 18928
- name: epoch_1
num_bytes: 44131220
num_examples: 18928
- name: epoch_2
num_bytes: 44221502
num_examples: 18928
- name: epoch_3
num_bytes: 44250409
num_examples: 18928
- name: epoch_4
num_bytes: 44273108
num_examples: 18928
- name: epoch_5
num_bytes: 44279451
num_examples: 18928
- name: epoch_6
num_bytes: 44279620
num_examples: 18928
- name: epoch_7
num_bytes: 44271457
num_examples: 18928
- name: epoch_8
num_bytes: 44268509
num_examples: 18928
- name: epoch_9
num_bytes: 44269772
num_examples: 18928
- name: epoch_10
num_bytes: 44265046
num_examples: 18928
- name: epoch_11
num_bytes: 44267650
num_examples: 18928
- name: epoch_12
num_bytes: 44267761
num_examples: 18928
- name: epoch_13
num_bytes: 44266259
num_examples: 18928
- name: epoch_14
num_bytes: 44265694
num_examples: 18928
- name: epoch_15
num_bytes: 44267406
num_examples: 18928
- name: epoch_16
num_bytes: 44266242
num_examples: 18928
- name: epoch_17
num_bytes: 44265158
num_examples: 18928
- name: epoch_18
num_bytes: 44266898
num_examples: 18928
- name: epoch_19
num_bytes: 44266264
num_examples: 18928
- name: epoch_20
num_bytes: 44268660
num_examples: 18928
- name: epoch_21
num_bytes: 44267640
num_examples: 18928
- name: epoch_22
num_bytes: 44266154
num_examples: 18928
- name: epoch_23
num_bytes: 44268320
num_examples: 18928
- name: epoch_24
num_bytes: 44266498
num_examples: 18928
- name: epoch_25
num_bytes: 44266204
num_examples: 18928
- name: epoch_26
num_bytes: 44266694
num_examples: 18928
- name: epoch_27
num_bytes: 44266989
num_examples: 18928
- name: epoch_28
num_bytes: 44264293
num_examples: 18928
- name: epoch_29
num_bytes: 44266970
num_examples: 18928
download_size: 690265588
dataset_size: 1327244406
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
wid4soe/182_simpsons | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: new_image
dtype: image
splits:
- name: train
num_bytes: 15287553.0
num_examples: 550
- name: test
num_bytes: 4336319.0
num_examples: 151
- name: valid
num_bytes: 1661844.0
num_examples: 54
download_size: 21245949
dataset_size: 21285716.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Lollitor/FSONLYPROTEIN | ---
dataset_info:
features:
- name: '#code'
dtype: string
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 14640735
num_examples: 16245
download_size: 233777
dataset_size: 14640735
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FSONLYPROTEIN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
berrypi/hu_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 105042739
num_examples: 625178
download_size: 59005996
dataset_size: 105042739
---
# Dataset Card for "hu_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/MMBench_modif_chatbot_NoMCQ | |
tyzhu/wiki_find_passage_train10_eval20_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 31200
num_examples: 40
- name: validation
num_bytes: 15614
num_examples: 20
download_size: 35822
dataset_size: 46814
---
# Dataset Card for "wiki_find_passage_train10_eval20_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hooool/tilled | ---
license: mit
language:
- en
pretty_name: tilled zendesk
--- |
sihaochen/propsegment | ---
license: cc-by-4.0
task_categories:
- text-classification
- token-classification
- text-generation
language:
- en
tags:
- NLP
- Entailment
- NLI
- google-research-datasets
pretty_name: PropSegment
size_categories:
- 10K<n<100K
---
# PropSegmEnt: A Large-Scale Corpus for Proposition-Level Segmentation and Entailment Recognition
## Dataset Description
- **Homepage:** https://github.com/google-research-datasets/PropSegmEnt
- **Repository:** https://github.com/google-research-datasets/PropSegmEnt
- **Paper:** https://arxiv.org/abs/2212.10750
- **Point of Contact:** sihaoc@seas.upenn.edu
### Dataset Summary
This is a reproduced (i.e. after web-crawling) and processed version of [the "PropSegment" dataset](https://github.com/google-research-datasets/PropSegmEnt) from Google Research.
Since the [`News`](https://github.com/google-research-datasets/NewSHead) portion of the dataset is released only via URLs, we reconstruct the dataset by crawling.
Overall, ~96% of the dataset can be reproduced; for the remaining ~4%, the URLs are either no longer valid or the sentences have been edited (i.e. they cannot be aligned with the original dataset).
PropSegment (Proposition-level Segmentation and Entailment) is a large-scale, human annotated dataset for segmenting English text into propositions, and recognizing proposition-level entailment relations --- whether a different, related document entails each proposition, contradicts it, or neither.
The original dataset features >45k human annotated propositions, i.e. individual semantic units within sentences, as well as >35k entailment labels between propositions and documents.
Check out more details in the [dataset paper](https://arxiv.org/abs/2212.10750).
## Dataset Structure
Here we provide processed versions of the dataset for seq2seq model inputs/outputs.
`proposition_segmentation.*.jsonl` contains data for the text segmentation task, i.e. split a sentence into propositions.
The output propositions are concatenated as one string (with no particular order between them) by a special token `[SEP]`.
Each proposition is annotated as spans enclosed by `[M]` and `[/M]`.
```
{
"sentence": "This film marks the directorial debut for production designer Robert Stromberg.",
"propositions": "This film marks the directorial debut for [M]production designer Robert Stromberg.[/M][SEP]This [M]film marks the directorial debut for[/M] production designer [M]Robert Stromberg[/M]."
}
```
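The `propositions` string can be decoded back into individual propositions by splitting on `[SEP]` and extracting the `[M]`…`[/M]` spans; a minimal sketch using the example above:

```python
import re

# Output string copied from the example above: two propositions joined by [SEP],
# each marking its own spans with [M]...[/M].
propositions = (
    "This film marks the directorial debut for "
    "[M]production designer Robert Stromberg.[/M]"
    "[SEP]"
    "This [M]film marks the directorial debut for[/M] "
    "production designer [M]Robert Stromberg[/M]."
)

# Split on [SEP] to recover the individual propositions, then pull out each
# proposition's marked spans (non-greedy so adjacent spans don't merge).
spans_per_prop = [
    re.findall(r"\[M\](.*?)\[/M\]", prop)
    for prop in propositions.split("[SEP]")
]
```

As the card notes, there is no particular order among the propositions in the output string.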
`propnli.*.jsonl` contains examples for the proposition-to-document entailment task, i.e. Given a proposition and a document, predict whether the proposition can be entailed/contradicted, or neutral with respect to the document.
```
{
"hypothesis": "[M]The Departed is[/M] a 2006 feature film [M]directed by Martin Scorsese.[/M]",
"premise": "The Departed is a 2006 American crime thriller film directed by Martin Scorsese and written by William Monahan. It starred Leonardo DiCaprio, Matt Damon, Jack Nicholson, and Mark Wahlberg, with Martin Sheen, Ray Winstone, Vera Farmiga, and Alec Baldwin in supporting roles. It is a remake of the Hong Kong film Infernal Affairs (2002).\nThe Departed won the Oscar for Best Picture at the 79th Academy Awards. Scorsese received the Oscar for Best Director, Thelma Schoonmaker the Oscar for Best Editing and William Monahan the Oscar for Best Adapted Screenplay.",
"label": "e"
}
```
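A minimal sketch of streaming `propnli.*.jsonl` examples and naming the single-letter labels. Note the `"c"`/`"n"` codes are an assumption by analogy with the `"e"` (entailment) label shown above; verify them against the actual files:

```python
import json

# Assumed label map: "e" appears in the example above; "c" (contradiction) and
# "n" (neutral) are inferred from the task description, not confirmed by the card.
LABEL_NAMES = {"e": "entailment", "c": "contradiction", "n": "neutral"}

def read_propnli(path):
    """Yield (hypothesis, premise, label_name) triples from a propnli JSONL file."""
    with open(path) as f:
        for line in f:
            ex = json.loads(line)
            yield ex["hypothesis"], ex["premise"], LABEL_NAMES.get(ex["label"], ex["label"])
```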
### Citation
```
@inproceedings{chen2023propsegment,
title = "{PropSegmEnt}: A Large-Scale Corpus for Proposition-Level Segmentation and Entailment Recognition",
author = "Chen, Sihao and Buthpitiya, Senaka and Fabrikant, Alex and Roth, Dan and Schuster, Tal",
booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
year = "2023",
}
```
|
kamilakesbi/callhome_spa | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: timestamps_start
sequence: float64
- name: timestamps_end
sequence: float64
- name: speakers
sequence: string
splits:
- name: data
num_bytes: 2456735672.0
num_examples: 140
download_size: 2424929812
dataset_size: 2456735672.0
configs:
- config_name: default
data_files:
- split: data
path: data/data-*
---
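The `timestamps_start`, `timestamps_end`, and `speakers` columns above appear to be parallel sequences per example; assuming that alignment, a minimal sketch of turning one example into per-speaker segments:

```python
def to_segments(example):
    # timestamps_start, timestamps_end and speakers are assumed to be parallel
    # sequences: element i describes one speech segment (speaker label, start, end).
    return list(zip(example["speakers"],
                    example["timestamps_start"],
                    example["timestamps_end"]))
```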
|
GenVRadmin/Samvaad_Eng_Guj_Translation | ---
license: mit
---
|
brycewang2018/StataCodes | ---
license: apache-2.0
---
|
miulab/tmlu | ---
task_categories:
- question-answering
- text-classification
language:
- zh
pretty_name: TMLU
size_categories:
- 1K<n<10K
configs:
- config_name: AST_chinese
data_files:
- split: test
path: "AST_chinese_test.jsonl"
- split: dev
path: "AST_chinese_dev.jsonl"
- config_name: AST_mathematics
data_files:
- split: test
path: "AST_mathematics_test.jsonl"
- split: dev
path: "AST_mathematics_dev.jsonl"
- config_name: AST_biology
data_files:
- split: test
path: "AST_biology_test.jsonl"
- split: dev
path: "AST_biology_dev.jsonl"
- config_name: AST_chemistry
data_files:
- split: test
path: "AST_chemistry_test.jsonl"
- split: dev
path: "AST_chemistry_dev.jsonl"
- config_name: AST_physics
data_files:
- split: test
path: "AST_physics_test.jsonl"
- split: dev
path: "AST_physics_dev.jsonl"
- config_name: AST_civics
data_files:
- split: test
path: "AST_civics_test.jsonl"
- split: dev
path: "AST_civics_dev.jsonl"
- config_name: AST_geography
data_files:
- split: test
path: "AST_geography_test.jsonl"
- split: dev
path: "AST_geography_dev.jsonl"
- config_name: AST_history
data_files:
- split: test
path: "AST_history_test.jsonl"
- split: dev
path: "AST_history_dev.jsonl"
- config_name: GSAT_chinese
data_files:
- split: test
path: "GSAT_chinese_test.jsonl"
- split: dev
path: "GSAT_chinese_dev.jsonl"
- config_name: GSAT_chemistry
data_files:
- split: test
path: "GSAT_chemistry_test.jsonl"
- split: dev
path: "GSAT_chemistry_dev.jsonl"
- config_name: GSAT_biology
data_files:
- split: test
path: "GSAT_biology_test.jsonl"
- split: dev
path: "GSAT_biology_dev.jsonl"
- config_name: GSAT_physics
data_files:
- split: test
path: "GSAT_physics_test.jsonl"
- split: dev
path: "GSAT_physics_dev.jsonl"
- config_name: GSAT_earth_science
data_files:
- split: test
path: "GSAT_earth_science_test.jsonl"
- split: dev
path: "GSAT_earth_science_dev.jsonl"
- config_name: GSAT_mathematics
data_files:
- split: test
path: "GSAT_mathematics_test.jsonl"
- split: dev
path: "GSAT_mathematics_dev.jsonl"
- config_name: GSAT_geography
data_files:
- split: test
path: "GSAT_geography_test.jsonl"
- split: dev
path: "GSAT_geography_dev.jsonl"
- config_name: GSAT_history
data_files:
- split: test
path: "GSAT_history_test.jsonl"
- split: dev
path: "GSAT_history_dev.jsonl"
- config_name: GSAT_civics
data_files:
- split: test
path: "GSAT_civics_test.jsonl"
- split: dev
path: "GSAT_civics_dev.jsonl"
- config_name: CAP_mathematics
data_files:
- split: test
path: "CAP_mathematics_test.jsonl"
- split: dev
path: "CAP_mathematics_dev.jsonl"
- config_name: CAP_biology
data_files:
- split: test
path: "CAP_biology_test.jsonl"
- split: dev
path: "CAP_biology_dev.jsonl"
- config_name: CAP_physics
data_files:
- split: test
path: "CAP_physics_test.jsonl"
- split: dev
path: "CAP_physics_dev.jsonl"
- config_name: CAP_chemistry
data_files:
- split: test
path: "CAP_chemistry_test.jsonl"
- split: dev
path: "CAP_chemistry_dev.jsonl"
- config_name: CAP_earth_science
data_files:
- split: test
path: "CAP_earth_science_test.jsonl"
- split: dev
path: "CAP_earth_science_dev.jsonl"
- config_name: CAP_civics
data_files:
- split: test
path: "CAP_civics_test.jsonl"
- split: dev
path: "CAP_civics_dev.jsonl"
- config_name: CAP_history
data_files:
- split: test
path: "CAP_history_test.jsonl"
- split: dev
path: "CAP_history_dev.jsonl"
- config_name: CAP_geography
data_files:
- split: test
path: "CAP_geography_test.jsonl"
- split: dev
path: "CAP_geography_dev.jsonl"
- config_name: CAP_chinese
data_files:
- split: test
path: "CAP_chinese_test.jsonl"
- split: dev
path: "CAP_chinese_dev.jsonl"
- config_name: driving_rule
data_files:
- split: test
path: "driving_rule_test.jsonl"
- split: dev
path: "driving_rule_dev.jsonl"
- config_name: basic_traditional_chinese_medicine
data_files:
- split: test
path: "basic_traditional_chinese_medicine_test.jsonl"
- split: dev
path: "basic_traditional_chinese_medicine_dev.jsonl"
- config_name: clinical_traditional_chinese_medicine
data_files:
- split: test
path: "clinical_traditional_chinese_medicine_test.jsonl"
- split: dev
path: "clinical_traditional_chinese_medicine_dev.jsonl"
- config_name: lawyer_qualification
data_files:
- split: test
path: "lawyer_qualification_test.jsonl"
- split: dev
path: "lawyer_qualification_dev.jsonl"
- config_name: nutritionist
data_files:
- split: test
path: "nutritionist_test.jsonl"
- split: dev
path: "nutritionist_dev.jsonl"
- config_name: tour_leader
data_files:
- split: test
path: "tour_leader_test.jsonl"
- split: dev
path: "tour_leader_dev.jsonl"
- config_name: tour_guide
data_files:
- split: test
path: "tour_guide_test.jsonl"
- split: dev
path: "tour_guide_dev.jsonl"
- config_name: taiwan_tourist_resources
data_files:
- split: test
path: "taiwan_tourist_resources_test.jsonl"
- split: dev
path: "taiwan_tourist_resources_dev.jsonl"
- config_name: clinical_psychologist
data_files:
- split: test
path: "clinical_psychologist_test.jsonl"
- split: dev
path: "clinical_psychologist_dev.jsonl"
- config_name: teacher_qualification
data_files:
- split: test
path: "teacher_qualification_test.jsonl"
- split: dev
path: "teacher_qualification_dev.jsonl"
- config_name: accountant
data_files:
- split: test
path: "accountant_test.jsonl"
- split: dev
path: "accountant_dev.jsonl"
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
- AST: Advanced Subjects Test (分科測驗; formerly the 指考, through ROC year 110)
- GSAT: General Scholastic Ability Test (學科能力測驗)
- CAP: Comprehensive Assessment Program for Junior High School Students (國中教育會考)
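Per the `configs` section above, every subject config maps to a `<config>_test.jsonl` and `<config>_dev.jsonl` file in the repo root. A small sketch of deriving those file names; loading a config itself would go through `datasets.load_dataset("miulab/tmlu", "<config>")`, which needs network access:

```python
def tmlu_files(config: str) -> dict:
    """Map a TMLU config name (e.g. "AST_chinese") to its JSONL file per split."""
    return {split: f"{config}_{split}.jsonl" for split in ("test", "dev")}
```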
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
### Evaluation
#### CAP
##### ChatGPT
Total: 199 / 389 (0.5116)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.5179 | 29 / 56 |
| mathematics | 0.3273 | 36 / 110 |
| physics | 0.5000 | 5 / 10 |
| chemistry | 0.2727 | 6 / 22 |
| biology | 0.4545 | 10 / 22 |
| earth science | 0.4000 | 4 / 10 |
| geography | 0.5750 | 23 / 40 |
| history | 0.8235 | 42 / 51 |
| civics | 0.6471 | 44 / 68 |
##### GPT-4-turbo
Total: 289 / 389 (0.7429)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.8571 | 48 / 56 |
| mathematics | 0.4000 | 44 / 110 |
| physics | 0.7000 | 7 / 10 |
| chemistry | 0.8182 | 18 / 22 |
| biology | 0.9091 | 20 / 22 |
| earth science | 0.8000 | 8 / 10 |
| geography | 0.9000 | 36 / 40 |
| history | 0.9608 | 49 / 51 |
| civics | 0.8676 | 59 / 68 |
##### Claude-Instant-1
Total: 214 / 389 (0.5501)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.6071 | 34 / 56 |
| mathematics | 0.2636 | 29 / 110 |
| physics | 0.4000 | 4 / 10 |
| chemistry | 0.4545 | 10 / 22 |
| biology | 0.5909 | 13 / 22 |
| earth science | 0.4000 | 4 / 10 |
| geography | 0.6500 | 26 / 40 |
| history | 0.8431 | 43 / 51 |
| civics | 0.7500 | 51 / 68 |
##### Claude-2
Total: 213 / 389 (0.5476)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.6071 | 34 / 56 |
| mathematics | 0.3727 | 41 / 110 |
| physics | 0.6000 | 6 / 10 |
| chemistry | 0.5000 | 11 / 22 |
| biology | 0.6364 | 14 / 22 |
| earth science | 0.7000 | 7 / 10 |
| geography | 0.7000 | 28 / 40 |
| history | 0.7255 | 37 / 51 |
| civics | 0.5147 | 35 / 68 |
#### GSAT
##### ChatGPT
Total: 180 / 387 (0.4651)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.3587 | 33 / 92 |
| mathematics | 0.2083 | 5 / 24 |
| physics | 0.3684 | 7 / 19 |
| chemistry | 0.2917 | 7 / 24 |
| biology | 0.2500 | 4 / 16 |
| earth science | 0.4211 | 8 / 19 |
| geography | 0.5455 | 24 / 44 |
| history | 0.6049 | 49 / 81 |
| civics | 0.6324 | 43 / 68 |
##### GPT-4-turbo
Total: 293 / 387 (0.7571)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.7826 | 72 / 92 |
| mathematics | 0.2500 | 6 / 24 |
| physics | 0.7368 | 14 / 19 |
| chemistry | 0.5417 | 13 / 24 |
| biology | 0.6875 | 11 / 16 |
| earth science | 0.8421 | 16 / 19 |
| geography | 0.8864 | 39 / 44 |
| history | 0.8519 | 69 / 81 |
| civics | 0.7794 | 53 / 68 |
##### Claude-instant-1
Total: 213 / 387 (0.5504)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.4891 | 45 / 92 |
| mathematics | 0.2500 | 6 / 24 |
| physics | 0.3684 | 7 / 19 |
| chemistry | 0.3333 | 8 / 24 |
| biology | 0.5625 | 9 / 16 |
| earth science | 0.4211 | 8 / 19 |
| geography | 0.6818 | 30 / 44 |
| history | 0.7160 | 58 / 81 |
| civics | 0.6176 | 42 / 68 |
##### Claude-2
Total: 180 / 387 (0.4651)
| Subject | Accuracy | correct / total |
|:------------- | -------- |:--------------- |
| chinese | 0.3152 | 29 / 92 |
| mathematics | 0.2083 | 5 / 24 |
| physics | 0.3684 | 7 / 19 |
| chemistry | 0.2917 | 7 / 24 |
| biology | 0.1875 | 3 / 16 |
| earth science | 0.2632 | 5 / 19 |
| geography | 0.6818 | 30 / 44 |
| history | 0.6914 | 56 / 81 |
| civics | 0.5588 | 38 / 68 |
#### AST
##### ChatGPT
Total: 193 / 405 (0.4765)
| Subject | Accuracy | correct / total |
|:----------- | -------- |:--------------- |
| chinese | 0.4365 | 55 / 126 |
| mathematics | 0.1500 | 3 / 20 |
| physics | 0.2368 | 9 / 38 |
| chemistry | 0.2759 | 8 / 29 |
| biology | 0.7500 | 27 / 36 |
| geography | 0.5094 | 27 / 53 |
| history | 0.7843 | 40 / 51 |
| civics | 0.4615 | 24 / 52 |
##### GPT-4-turbo
Total: 280 / 405 (0.6914)
| Subject | Accuracy | correct / total |
|:----------- | -------- |:--------------- |
| chinese | 0.7302 | 92 / 126 |
| mathematics | 0.1500 | 3 / 20 |
| physics | 0.5263 | 20 / 38 |
| chemistry | 0.3103 | 9 / 29 |
| biology | 0.8889 | 32 / 36 |
| geography | 0.6981 | 37 / 53 |
| history | 0.9804 | 50 / 51 |
| civics | 0.7115 | 37 / 52 |
##### Claude-instant-1
Total: 219 / 405 (0.5407)
| Subject | Accuracy | correct / total |
|:----------- | -------- |:--------------- |
| chinese | 0.5635 | 71 / 126 |
| mathematics | 0.3500 | 7 / 20 |
| physics | 0.3947 | 15 / 38 |
| chemistry | 0.1724 | 5 / 29 |
| biology | 0.6389 | 23 / 36 |
| geography | 0.6038 | 32 / 53 |
| history | 0.6863 | 35 / 51 |
| civics | 0.5962 | 31 / 52 |
##### Claude-2
Total: 185 / 405 (0.4568)
| Subject | Accuracy | correct / total |
|:----------- | -------- |:--------------- |
| chinese | 0.4365 | 55 / 126 |
| mathematics | 0.0500 | 1 / 20 |
| physics | 0.3421 | 13 / 38 |
| chemistry | 0.1034 | 3 / 29 |
| biology | 0.4444 | 16 / 36 |
| geography | 0.6604 | 35 / 53 |
| history | 0.7255 | 37 / 51 |
| civics | 0.4808 | 25 / 52 |
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dongyoung4091/shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: post_id
dtype: string
- name: domain
dtype: string
- name: upvote_ratio
dtype: float64
- name: history
dtype: string
- name: c_root_id_A
dtype: string
- name: c_root_id_B
dtype: string
- name: created_at_utc_A
dtype: int64
- name: created_at_utc_B
dtype: int64
- name: score_A
dtype: int64
- name: score_B
dtype: int64
- name: human_ref_A
dtype: string
- name: human_ref_B
dtype: string
- name: labels
dtype: int64
- name: seconds_difference
dtype: float64
- name: score_ratio
dtype: float64
- name: helpfulness_A
dtype: float64
- name: helpfulness_B
dtype: float64
- name: specificity_A
dtype: float64
- name: specificity_B
dtype: float64
- name: intent_A
dtype: float64
- name: intent_B
dtype: float64
- name: factuality_A
dtype: float64
- name: factuality_B
dtype: float64
- name: easy-to-understand_A
dtype: float64
- name: easy-to-understand_B
dtype: float64
- name: relevance_A
dtype: float64
- name: relevance_B
dtype: float64
- name: readability_A
dtype: float64
- name: readability_B
dtype: float64
- name: enough-detail_A
dtype: float64
- name: enough-detail_B
dtype: float64
- name: biased:_A
dtype: float64
- name: biased:_B
dtype: float64
- name: fail-to-consider-individual-preferences_A
dtype: float64
- name: fail-to-consider-individual-preferences_B
dtype: float64
- name: repetetive_A
dtype: float64
- name: repetetive_B
dtype: float64
- name: fail-to-consider-context_A
dtype: float64
- name: fail-to-consider-context_B
dtype: float64
- name: too-long_A
dtype: float64
- name: too-long_B
dtype: float64
- name: __index_level_0__
dtype: int64
- name: log_score_A
dtype: float64
- name: log_score_B
dtype: float64
- name: zeroshot_helpfulness_A
dtype: float64
- name: zeroshot_helpfulness_B
dtype: float64
- name: zeroshot_specificity_A
dtype: float64
- name: zeroshot_specificity_B
dtype: float64
- name: zeroshot_intent_A
dtype: float64
- name: zeroshot_intent_B
dtype: float64
- name: zeroshot_factuality_A
dtype: float64
- name: zeroshot_factuality_B
dtype: float64
- name: zeroshot_easy-to-understand_A
dtype: float64
- name: zeroshot_easy-to-understand_B
dtype: float64
- name: zeroshot_relevance_A
dtype: float64
- name: zeroshot_relevance_B
dtype: float64
- name: zeroshot_readability_A
dtype: float64
- name: zeroshot_readability_B
dtype: float64
- name: zeroshot_enough-detail_A
dtype: float64
- name: zeroshot_enough-detail_B
dtype: float64
- name: zeroshot_biased:_A
dtype: float64
- name: zeroshot_biased:_B
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_A
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_B
dtype: float64
- name: zeroshot_repetetive_A
dtype: float64
- name: zeroshot_repetetive_B
dtype: float64
- name: zeroshot_fail-to-consider-context_A
dtype: float64
- name: zeroshot_fail-to-consider-context_B
dtype: float64
- name: zeroshot_too-long_A
dtype: float64
- name: zeroshot_too-long_B
dtype: float64
splits:
- name: train
num_bytes: 22674534
num_examples: 9459
- name: test
num_bytes: 22627412
num_examples: 9459
download_size: 12124964
dataset_size: 45301946
---
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kunhao__pile-7b | ---
pretty_name: Evaluation run of Kunhao/pile-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kunhao/pile-7b](https://huggingface.co/Kunhao/pile-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kunhao__pile-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:02:00.215909](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b/blob/main/results_2023-08-17T14%3A02%3A00.215909.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26607314141949256,\n\
\ \"acc_stderr\": 0.031950603341667064,\n \"acc_norm\": 0.2676071883857905,\n\
\ \"acc_norm_stderr\": 0.03196207703098002,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931572,\n \"mc2\": 0.4240744665255174,\n\
\ \"mc2_stderr\": 0.014948776413812296\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2380546075085324,\n \"acc_stderr\": 0.012445770028026203,\n\
\ \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.01294203019513643\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3269269069906393,\n\
\ \"acc_stderr\": 0.004681316064444439,\n \"acc_norm\": 0.3875721967735511,\n\
\ \"acc_norm_stderr\": 0.004862003566798543\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821239,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.018776052319619624,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.018776052319619624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.026558372502661923,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.026558372502661923\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\
\ \"acc_stderr\": 0.02514093595033545,\n \"acc_norm\": 0.1794871794871795,\n\
\ \"acc_norm_stderr\": 0.02514093595033545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686941,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810398,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658544,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658544\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290413,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290413\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.011044892264040772,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.011044892264040772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500347,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500347\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984924,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984924\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931572,\n \"mc2\": 0.4240744665255174,\n\
\ \"mc2_stderr\": 0.014948776413812296\n }\n}\n```"
repo_url: https://huggingface.co/Kunhao/pile-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:02:00.215909.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- results_2023-08-17T14:02:00.215909.parquet
- split: latest
path:
- results_2023-08-17T14:02:00.215909.parquet
---
# Dataset Card for Evaluation run of Kunhao/pile-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kunhao/pile-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kunhao/pile-7b](https://huggingface.co/Kunhao/pile-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kunhao__pile-7b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-17T14:02:00.215909](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b/blob/main/results_2023-08-17T14%3A02%3A00.215909.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26607314141949256,
"acc_stderr": 0.031950603341667064,
"acc_norm": 0.2676071883857905,
"acc_norm_stderr": 0.03196207703098002,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931572,
"mc2": 0.4240744665255174,
"mc2_stderr": 0.014948776413812296
},
"harness|arc:challenge|25": {
"acc": 0.2380546075085324,
"acc_stderr": 0.012445770028026203,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.01294203019513643
},
"harness|hellaswag|10": {
"acc": 0.3269269069906393,
"acc_stderr": 0.004681316064444439,
"acc_norm": 0.3875721967735511,
"acc_norm_stderr": 0.004862003566798543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707841,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707841
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617722,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617722
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.018776052319619624,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.018776052319619624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.026558372502661923,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.026558372502661923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.02514093595033545,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.02514093595033545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686941,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686941
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810398,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658544,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658544
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290413,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290413
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.011044892264040772,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.011044892264040772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500347,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500347
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984924,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984924
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931572,
"mc2": 0.4240744665255174,
"mc2_stderr": 0.014948776413812296
}
}
```
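As a quick illustration (not part of the leaderboard's official tooling), here is a minimal sketch of ranking tasks by accuracy once results like the JSON above have been loaded as a Python dict; the three entries below are a hand-copied subset of the listing above:

```python
# Hand-copied subset of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24, "acc_norm": 0.24},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.46296296296296297, "acc_norm": 0.46296296296296297},
    "harness|hendrycksTest-management|5": {"acc": 0.17475728155339806, "acc_norm": 0.17475728155339806},
}

def rank_by_accuracy(results: dict) -> list[tuple[str, float]]:
    """Return (task, acc) pairs sorted from best to worst, skipping entries without "acc"."""
    pairs = [(task, scores["acc"]) for task, scores in results.items() if "acc" in scores]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

ranked = rank_by_accuracy(results)
print(ranked[0][0])  # best-scoring task name
```

The same pattern applies to `acc_norm` or the TruthfulQA `mc1`/`mc2` keys by swapping the field name.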
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
j-chim/pii-pile-chunk3-250000-300000-tagged | ---
dataset_info:
features:
- name: texts
sequence: string
- name: meta
struct:
- name: pile_set_name
dtype: string
- name: scores
sequence: float64
- name: avg_score
dtype: float64
- name: num_sents
dtype: int64
- name: tagged_pii_results
list:
- name: analysis_explanation
dtype: 'null'
- name: end
dtype: int64
- name: entity_type
dtype: string
- name: recognition_metadata
struct:
- name: recognizer_identifier
dtype: string
- name: recognizer_name
dtype: string
- name: score
dtype: float64
- name: start
dtype: int64
splits:
- name: train
num_bytes: 526454655
num_examples: 49999
download_size: 201949320
dataset_size: 526454655
---
# Dataset Card for "pii-pile-chunk3-250000-300000-tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mikelue/ai-tube-jess | ---
license: cc-by-nc-sa-4.0
pretty_name: Jess
---
## Description
I explore the past so you don't have to!
## Prompt
A channel run by an influencer and videoblogger called Jess.
She often does weird challenges like "saying yes to everyone", "walking to cross the United States", or "walking in New York dressed as a chicken" to get millions of views and likes.
She also sometimes gives tips and advice on make-up, beauty, dating, etc., but she now makes random videos.
She is also a pro gamer, enjoying games like League of Legends, Fortnite, Call of Duty, The Sims, GTA 5, and Baldur's Gate 3.
tyzhu/find_second_sent_train_100_eval_10_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 168555
num_examples: 100
- name: validation
num_bytes: 17349
num_examples: 10
download_size: 0
dataset_size: 185904
---
# Dataset Card for "find_second_sent_train_100_eval_10_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ranjan22/autotrain-data-text2hashtag | ---
task_categories:
- summarization
---
# AutoTrain Dataset for project: text2hashtag
## Dataset Description
This dataset has been automatically processed by AutoTrain for project text2hashtag.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "An intense battle between moderates and progressives has already spilled into public view.",
"target": "['Presidential Election of 2020', 'Police Reform', 'United States Politics and Government', 'Black People', 'Blacks', 'Elections, House of Representatives', 'George Floyd Protests (2020)', 'Democratic Party', 'Democratic Socialists of America', 'Justice Democrats', 'Sunrise Movement']"
},
{
"text": "The blaze is setting off mines and other ordnance littering the war zone in eastern Ukraine, hampering already dangerous firefighting and evacuation efforts.",
"target": "['Wildfires', 'Explosions (Accidental)', 'Ukraine']"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
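Note that in the samples above the `target` field holds a Python-list literal stored as a string rather than an actual list. A minimal sketch of turning it back into a list of hashtags (this is an assumption based on the sample shown, not documented behaviour of the dataset):

```python
import ast

# Sample record copied from the "Data Instances" section above; the "target"
# value is a stringified Python list, so parse it before use.
sample = {
    "text": "The blaze is setting off mines and other ordnance littering the war zone in eastern Ukraine, hampering already dangerous firefighting and evacuation efforts.",
    "target": "['Wildfires', 'Explosions (Accidental)', 'Ukraine']",
}

hashtags = ast.literal_eval(sample["target"])
print(hashtags[0])  # Wildfires
```

`ast.literal_eval` is preferable to `eval` here because it only accepts Python literals and refuses arbitrary expressions.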
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2399 |
| valid | 600 |
|
chrisgg1/keywords_verbinden5 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': eins
'1': ja
'2': nein
'3': verbinden
splits:
- name: train
num_bytes: 1036620540.45
num_examples: 7981
download_size: 592054581
dataset_size: 1036620540.45
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BhabhaAI/DEITA-Complexity | ---
license: apache-2.0
---
|
1234phim/hot | ---
license: unlicense
---
|
nateraw/us-accidents | ---
license:
- cc-by-nc-sa-4.0
kaggle_id: sobhanmoosavi/us-accidents
---
# Dataset Card for US Accidents (2016 - 2021)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/sobhanmoosavi/us-accidents
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
### Description
This is a countrywide car accident dataset, which covers __49 states of the USA__. The accident data are collected from __February 2016 to Dec 2021__, using multiple APIs that provide streaming traffic incident (or event) data. These APIs broadcast traffic data captured by a variety of entities, such as the US and state departments of transportation, law enforcement agencies, traffic cameras, and traffic sensors within the road-networks. Currently, there are about __2.8 million__ accident records in this dataset. Check [here](https://smoosavi.org/datasets/us_accidents) to learn more about this dataset.
### Acknowledgements
Please cite the following papers if you use this dataset:
- Moosavi, Sobhan, Mohammad Hossein Samavatian, Srinivasan Parthasarathy, and Rajiv Ramnath. “[A Countrywide Traffic Accident Dataset](https://arxiv.org/abs/1906.05409).”, 2019.
- Moosavi, Sobhan, Mohammad Hossein Samavatian, Srinivasan Parthasarathy, Radu Teodorescu, and Rajiv Ramnath. ["Accident Risk Prediction based on Heterogeneous Sparse Data: New Dataset and Insights."](https://arxiv.org/abs/1909.09638) In proceedings of the 27th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, ACM, 2019.
### Content
This dataset has been collected in real-time, using multiple Traffic APIs. Currently, it contains accident data that are collected from February 2016 to Dec 2021 for the Contiguous United States. Check [here](https://smoosavi.org/datasets/us_accidents) to learn more about this dataset.
### Inspiration
US-Accidents can be used for numerous applications such as real-time car accident prediction, studying car accidents hotspot locations, casualty analysis and extracting cause and effect rules to predict car accidents, and studying the impact of precipitation or other environmental stimuli on accident occurrence. The most recent release of the dataset can also be useful to study the impact of COVID-19 on traffic behavior and accidents.
### Usage Policy and Legal Disclaimer
This dataset is being distributed only for __Research__ purposes, under Creative Commons Attribution-Noncommercial-ShareAlike license (CC BY-NC-SA 4.0). By clicking on download button(s) below, you are agreeing to use this data only for non-commercial, research, or academic applications. You may need to cite the above papers if you use this dataset.
### Inquiries or need help?
For any inquiries, contact me at moosavi.3@osu.edu
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@sobhanmoosavi](https://kaggle.com/sobhanmoosavi)
### Licensing Information
The license for this dataset is cc-by-nc-sa-4.0
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
myunghwan/llama2ovm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4887
num_examples: 29
download_size: 4190
dataset_size: 4887
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_adj_postfix | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 10818
num_examples: 125
- name: test
num_bytes: 9805
num_examples: 119
- name: train
num_bytes: 76899
num_examples: 922
download_size: 51341
dataset_size: 97522
---
# Dataset Card for "MULTI_VALUE_cola_adj_postfix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
acloudfan/lunarian-fictious-language | ---
license: apache-2.0
---
This is a sample dataset for an exercise to try out training an existing pre-trained tokenizer.
Background:
Scientists have discovered a new tribe. This new tribe worships the celestial bodies (the moon is their god).
The tribe uses a language referred to as 'Lunarian'. Scientists have decided to build an LLM to help them communicate with the tribe.
The LLM will carry out Lunarian-to-English translation.
Our task is to train an existing BERT-based tokenizer. The new tokenizer will be used for training the LLM.
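As a rough illustration of what "training a tokenizer" amounts to, here is a toy frequency-based vocabulary builder in plain Python. This is only a sketch: it is not the actual `transformers` tokenizer-training workflow, and the Lunarian words are invented for the example:

```python
from collections import Counter

def build_vocab(corpus, vocab_size):
    """Toy sketch: collect the most frequent whitespace-split tokens as a vocabulary."""
    counts = Counter(word for line in corpus for word in line.split())
    # Most common words first; real subword tokenizers merge characters instead.
    return [word for word, _ in counts.most_common(vocab_size)]

# Invented example sentences standing in for Lunarian text.
corpus = [
    "luna zor zor mika",
    "zor luna tellu",
    "mika luna zor",
]
vocab = build_vocab(corpus, vocab_size=3)
print(vocab)  # the three most frequent tokens
```

A real exercise would instead feed an iterator over this dataset's text to an existing pre-trained tokenizer's retraining routine, so that the subword merges adapt to Lunarian.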
|
gilkeyio/AudioMNIST | ---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- audio-classification
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: speaker_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: digit
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
- name: gender
dtype:
class_label:
names:
'0': male
'1': female
- name: accent
dtype: string
- name: age
dtype: int64
- name: native_speaker
dtype: bool
- name: origin
dtype: string
splits:
- name: train
num_bytes: 1493209727.0
num_examples: 24000
- name: test
num_bytes: 360966680.0
num_examples: 6000
download_size: 1483680961
dataset_size: 1854176407.0
---
# Dataset Card for "AudioMNIST"
The [audioMNIST](https://github.com/soerenab/AudioMNIST) dataset has 50 English recordings per digit (0-9) from each of 60 speakers.
There are 60 participants in total, 12 women and 48 men, featuring a diverse range of accents and countries of origin. Their ages vary from 22 to 61 years old. This is a great dataset for exploring a simple audio classification problem: classifying either the digit or the speaker's gender.
## Bias, Risks, and Limitations
* The genders represented in the dataset are unbalanced, with around 80% being men.
* The majority of the speakers, around 70%, have a German accent.
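The gender imbalance above follows directly from the participant counts; a quick sanity check, hand-computed from the numbers in this card:

```python
# Participant counts stated in this card.
men, women = 48, 12
total = men + women
assert total == 60

male_share = men / total
print(f"{male_share:.0%} of speakers are men")  # 80% of speakers are men

# The split sizes are also consistent with 60 speakers x 10 digits x 50 recordings.
assert 60 * 10 * 50 == 24000 + 6000
```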
### Citation Information
The original creators of the dataset ask you to cite [their paper](https://arxiv.org/abs/1807.03418) if you use this data:
```
@ARTICLE{becker2018interpreting,
author = {Becker, S\"oren and Ackermann, Marcel and Lapuschkin, Sebastian and M\"uller, Klaus-Robert and Samek, Wojciech},
title = {Interpreting and Explaining Deep Neural Networks for Classification of Audio Signals},
journal = {CoRR},
volume = {abs/1807.03418},
year = {2018},
archivePrefix = {arXiv},
eprint = {1807.03418},
}
``` |
JorangHorse/SecondTest | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1008242.0
num_examples: 2
download_size: 556516
dataset_size: 1008242.0
---
# Dataset Card for "SecondTest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sanjay920/glaive-functions | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: tools
dtype: string
- name: system
dtype: string
splits:
- name: train
num_bytes: 204919508
num_examples: 100563
download_size: 87428559
dataset_size: 204919508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-200453bd-7694966 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- masakhaner
eval_info:
task: entity_extraction
model: mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-luo
metrics: []
dataset_name: masakhaner
dataset_config: swa
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-luo
* Dataset: masakhaner
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
persiannlp/parsinlu_translation_fa_en | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- fa
license:
- cc-by-nc-sa-4.0
multilinguality:
- fa
- en
size_categories:
- 1K<n<10K
source_datasets:
- extended
task_categories:
- translation
task_ids:
- translation
---
# Dataset Card for ParsiNLU (Machine Translation)
## Table of Contents
- [Dataset Card for ParsiNLU (Machine Translation)](#dataset-card-for-parsinlu-machine-translation)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/persiannlp/parsinlu/)
- **Repository:** [Github](https://github.com/persiannlp/parsinlu/)
- **Paper:** [Arxiv](https://arxiv.org/abs/2012.06154)
- **Leaderboard:**
- **Point of Contact:** d.khashabi@gmail.com
### Dataset Summary
A Persian-English machine translation dataset (Persian -> English).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text dataset is in Persian (`fa`) and English (`en`).
## Dataset Structure
### Data Instances
Here is an example from the dataset:
```json
{
"source": "چه زحمتها که بکشد تا منابع مالی را تامین کند اصطلاحات را ترویج کند نهادهایی به راه اندازد.",
"targets": ["how toil to raise funds, propagate reforms, initiate institutions!"],
"category": "mizan_dev_en_fa"
}
```
### Data Fields
- `source`: the input sentences, in Persian.
- `targets`: the list of gold target translations in English.
- `category`: the source from which the example is mined.
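As an illustration of these fields, here is a hypothetical helper (not part of the dataset's API) that turns the example record above into a (source, translation) training pair, taking the first gold target:

```python
# The record below is the card's own example; `to_pair` is a hypothetical
# helper for illustration, not part of the dataset's API.
record = {
    "source": "چه زحمتها که بکشد تا منابع مالی را تامین کند اصطلاحات را ترویج کند نهادهایی به راه اندازد.",
    "targets": ["how toil to raise funds, propagate reforms, initiate institutions!"],
    "category": "mizan_dev_en_fa",
}

def to_pair(record: dict) -> tuple[str, str]:
    """Pair the Persian source with the first gold English translation."""
    return record["source"], record["targets"][0]

src, tgt = to_pair(record)
print(tgt)  # how toil to raise funds, propagate reforms, initiate institutions!
```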
### Data Splits
The train/dev/test split contains 1,622,281/2,138/47,745 samples.
## Dataset Creation
### Curation Rationale
For details, check [the corresponding draft](https://arxiv.org/abs/2012.06154).
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
CC BY-NC-SA 4.0 License
### Citation Information
```bibtex
@article{huggingface:dataset,
title = {ParsiNLU: A Suite of Language Understanding Challenges for Persian},
authors = {Khashabi, Daniel and Cohan, Arman and Shakeri, Siamak and Hosseini, Pedram and Pezeshkpour, Pouya and Alikhani, Malihe and Aminnaseri, Moin and Bitaab, Marzieh and Brahman, Faeze and Ghazarian, Sarik and others},
year={2020}
journal = {arXiv e-prints},
eprint = {2012.06154},
}
```
### Contributions
Thanks to [@danyaljj](https://github.com/danyaljj) for adding this dataset.
|
cg1177/qvhighlight_internvideo2_videoclip_6b_w2s | ---
license: apache-2.0
---
|
nova-sqoin/hotel_dataset_llama2 | ---
license: apache-2.0
language:
- pt
- en
size_categories:
- n<1K
--- |
msaad02/formatted-ss-cleaned-brockport-qa | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2897094
num_examples: 7098
download_size: 828075
dataset_size: 2897094
---
# Dataset Card for "formatted-ss-cleaned-brockport-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/stackexchange-clustering-p2p | ---
language:
- en
--- |
DanielHesslow/SwissProt-GO | ---
language:
- protein sequences
datasets:
- Swissprot
tags:
- Protein
- Gene Ontology
- GO
---
SwissProt is a high-quality, manually annotated protein database. The dataset contains annotations describing the functional properties of the proteins. Here we extract proteins with Gene Ontology labels.
The dataset is ported from ProteInfer: https://github.com/google-research/proteinfer.
The GO labels are extracted and indexed; the mapping is provided in `idx_mapping.json`. Proteins without GO tags are removed.
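A minimal sketch of using the index mapping to build multi-hot classification targets. The exact structure of `idx_mapping.json` (here assumed to be a GO-term-to-index dict) is an assumption and should be checked against the file itself:

```python
# go_to_idx stands in for json.load(open("idx_mapping.json")); its exact
# structure (GO term -> integer index) is an assumption for illustration.
go_to_idx = {"GO:0005524": 0, "GO:0016301": 1, "GO:0006468": 2}

def to_multihot(go_terms: list[str], go_to_idx: dict[str, int]) -> list[int]:
    """Encode a protein's GO annotations as a multi-hot target vector."""
    vec = [0] * len(go_to_idx)
    for term in go_terms:
        vec[go_to_idx[term]] = 1
    return vec

print(to_multihot(["GO:0005524", "GO:0006468"], go_to_idx))  # [1, 0, 1]
```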
|
open-llm-leaderboard/details_psmathur__model_007_v2 | ---
pretty_name: Evaluation run of psmathur/model_007_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_007_v2](https://huggingface.co/psmathur/model_007_v2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_007_v2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T09:02:32.950364](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_v2_public/blob/main/results_2023-11-09T09-02-32.950364.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1636954697986577,\n\
\ \"em_stderr\": 0.0037891361135837117,\n \"f1\": 0.31382655201342385,\n\
\ \"f1_stderr\": 0.0038067833114928977,\n \"acc\": 0.5639691402386229,\n\
\ \"acc_stderr\": 0.011361388955682963\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1636954697986577,\n \"em_stderr\": 0.0037891361135837117,\n\
\ \"f1\": 0.31382655201342385,\n \"f1_stderr\": 0.0038067833114928977\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28658074298711145,\n \
\ \"acc_stderr\": 0.012454841668337704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028223\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_007_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T09_02_32.950364
path:
- '**/details_harness|drop|3_2023-11-09T09-02-32.950364.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T09-02-32.950364.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T09_02_32.950364
path:
- '**/details_harness|gsm8k|5_2023-11-09T09-02-32.950364.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T09-02-32.950364.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T09_02_32.950364
path:
- '**/details_harness|winogrande|5_2023-11-09T09-02-32.950364.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T09-02-32.950364.parquet'
- config_name: results
data_files:
- split: 2023_11_09T09_02_32.950364
path:
- results_2023-11-09T09-02-32.950364.parquet
- split: latest
path:
- results_2023-11-09T09-02-32.950364.parquet
---
# Dataset Card for Evaluation run of psmathur/model_007_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_007_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_007_v2](https://huggingface.co/psmathur/model_007_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_007_v2_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-09T09:02:32.950364](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_v2_public/blob/main/results_2023-11-09T09-02-32.950364.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1636954697986577,
"em_stderr": 0.0037891361135837117,
"f1": 0.31382655201342385,
"f1_stderr": 0.0038067833114928977,
"acc": 0.5639691402386229,
"acc_stderr": 0.011361388955682963
},
"harness|drop|3": {
"em": 0.1636954697986577,
"em_stderr": 0.0037891361135837117,
"f1": 0.31382655201342385,
"f1_stderr": 0.0038067833114928977
},
"harness|gsm8k|5": {
"acc": 0.28658074298711145,
"acc_stderr": 0.012454841668337704
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028223
}
}
```
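A quick sanity check on the numbers above: the top-level `acc` appears to be the unweighted mean of the GSM8K and Winogrande accuracies:

```python
# Reproduce the aggregated "all" accuracy from the per-task results above.
gsm8k_acc = 0.28658074298711145
winogrande_acc = 0.8413575374901342
overall_acc = (gsm8k_acc + winogrande_acc) / 2
# Matches the reported "all" accuracy of 0.5639691402386229 (up to float rounding).
print(overall_acc)
```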
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/gita_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gita/ジータ (Granblue Fantasy)
This is the dataset of gita/ジータ (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `short_hair, blonde_hair, brown_eyes, breasts, hairband, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 760.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 414.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1236 | 907.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 671.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1236 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gita_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, collarbone, gauntlets, holding_sword, looking_at_viewer, pink_dress, short_sleeves, white_background, zettai_ryouiki, bow, brown_thighhighs, cleavage, hair_intakes, pink_skirt, puffy_sleeves, smile, solo, white_shirt, blush, closed_mouth, red_hairband, simple_background, thigh_boots, brown_footwear, cowboy_shot, open_mouth, pink_hairband, thighs, unsheathed, v-shaped_eyebrows |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, solo, black_gloves, smile, red_necktie, upper_body, simple_background, white_background, stethoscope, labcoat, sleeveless |
| 2 | 7 |  |  |  |  |  | 1girl, pleated_skirt, witch_hat, looking_at_viewer, solo, thigh_boots, thighhighs, white_gloves, white_skirt, collarbone, puffy_short_sleeves, simple_background, blush, cleavage, neckerchief, shirt, smile, white_background, black_headwear, open_mouth, sailor_collar, staff, zettai_ryouiki |
| 3 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, bare_shoulders, white_dress, armlet, bracelet, cleavage, large_breasts, sideboob, veil, white_background, blush, yellow_eyes, covered_nipples, open_mouth, revealing_clothes |
| 4 | 16 |  |  |  |  |  | 1girl, alternate_costume, white_gloves, looking_at_viewer, puffy_short_sleeves, solo, skirt, hair_ribbon, open_mouth, blue_ribbon, hair_bow, blush, :d, one_eye_closed, simple_background, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, sailor_collar, serafuku, solo, twin_braids, hair_bow, pleated_skirt, holding_bag, medium_hair, official_alternate_costume, pink_hairband, school_bag, simple_background, white_background, white_shirt, yellow_eyes, black_skirt, blush, petals, pink_neckerchief, puffy_short_sleeves, smile |
| 6 | 5 |  |  |  |  |  | 1girl, fur_trim, looking_at_viewer, paw_gloves, solo, upper_body, blush, cat_hood, hood_up, open_mouth, :d, fake_animal_ears, black_gloves, cat_ears, long_sleeves, simple_background, white_background |
| 7 | 10 |  |  |  |  |  | 1girl, midriff, navel, solo, hair_ornament, looking_at_viewer, crop_top, earrings, short_shorts, armor, black_hairband, cape, stomach, black_gloves, blush, cleavage_cutout, red_shorts, thighhighs, belt, closed_mouth, gauntlets, smile, thighs, ahoge, ear_piercing, groin, hair_between_eyes, holding, weapon |
| 8 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, red_eyes, solo, black_jacket, off_shoulder, x_hair_ornament, bare_shoulders, crop_top, midriff, navel, belt, black_choker, collarbone, earrings, simple_background, smile, tongue_out, fishnets, grey_hair, shorts, white_background, feather_boa, fruit, open_clothes, open_mouth, single_leg_pantyhose, skirt, black_nails, cleavage, holding, swept_bangs |
| 9 | 8 |  |  |  |  |  | blush, 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, navel, official_alternate_costume, smile, solo, beach, open_mouth, bikini, hair_ornament, innertube, thigh_strap, day, large_breasts, one-piece_swimsuit, blue_sky, outdoors, thighs |
| 10 | 9 |  |  |  |  |  | 1girl, solo, hood_down, long_sleeves, looking_at_viewer, blue_eyes, blue_hair, hair_ornament, hooded_jacket, white_background, black_hairband, simple_background, smile, black_jacket, open_jacket, shirt, black_shorts, collarbone, jewelry, parted_lips, short_shorts, white_jacket |
| 11 | 5 |  |  |  |  |  | 1girl, blush, braid, floral_print, hair_flower, holding_food, looking_at_viewer, upper_body, yukata, cotton_candy, obi, solo, eating, open_mouth, pink_kimono, print_kimono, fireworks, night, white_kimono, wide_sleeves, yellow_eyes |
| 12 | 5 |  |  |  |  |  | 1girl, hair_flower, looking_at_viewer, obi, smile, solo, wide_sleeves, yukata, blush, floral_print, pink_kimono, upper_body, hair_bobbles, long_sleeves, print_kimono, twin_braids, aerial_fireworks, closed_mouth, food, holding, night_sky, open_mouth, outdoors, twitter_username, white_kimono, yellow_eyes |
| 13 | 25 |  |  |  |  |  | 1girl, fake_animal_ears, rabbit_ears, blush, looking_at_viewer, solo, hair_flower, wrist_cuffs, cape, playboy_bunny, rabbit_tail, smile, thighhighs, open_mouth, alternate_costume, white_leotard, cleavage, short_sleeves, simple_background, large_breasts, white_background |
| 14 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, mosaic_censoring, nipples, simple_background, open_mouth, white_background, completely_nude, penis, yellow_eyes, ass, collarbone, girl_on_top, large_breasts, navel, pink_hairband, pussy, sex, straddling |
| 15 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, solo_focus, bikini, open_mouth, paizuri, smile, blush, looking_at_viewer, pov, yellow_eyes, fang, gigantic_breasts, huge_breasts, white_background |
### Table Version

The table version presents the same clusters as above in one-hot form: one row per cluster (with the same five sample images) and one column per tag, marking which of the tags listed in the raw text version apply to each cluster.
|
CyberHarem/chanzhi_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chanzhi/纏枝/缠枝 (Neural Cloud)
This is the dataset of chanzhi/纏枝/缠枝 (Neural Cloud), containing 20 images and their tags.
The core tags of this character are `long_hair, bangs, black_hair, yellow_eyes, ribbon, brown_eyes, hair_bun, hairband, hair_ribbon, breasts, brown_hair, double_bun, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 26.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chanzhi_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chanzhi_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 41 | 28.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chanzhi_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 23.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chanzhi_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 41 | 38.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chanzhi_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chanzhi_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, black_gloves, rifle, white_shirt, dress, fingerless_gloves, long_sleeves, black_footwear, full_body, open_mouth, shoes, skirt, holding_gun, open_coat, yellow_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | black_gloves | rifle | white_shirt | dress | fingerless_gloves | long_sleeves | black_footwear | full_body | open_mouth | shoes | skirt | holding_gun | open_coat | yellow_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:---------------|:--------|:--------------|:--------|:--------------------|:---------------|:-----------------|:------------|:-------------|:--------|:--------|:--------------|:------------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
HuangHaoyang/test1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: test1
dtype: image
splits:
- name: train
num_bytes: 441047.0
num_examples: 2
download_size: 440372
dataset_size: 441047.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
microsoft/kitab | ---
license: mit
configs:
- config_name: one-book-constraints
data_files:
- split: test
path: "data/KITAB-ONE-BOOK-CONSTRAINTS.json"
- config_name: two-book-constraints
data_files:
- split: test
path: "data/KITAB-TWO-BOOK-CONSTRAINTS.json"
- config_name: author-metadata
data_files:
- split: test
path: "data/KITAB-author-metadata.json"
config_names:
- one-book-constraints
- two-book-constraints
- author-metadata
---
## Overview
🕮 KITAB is a challenging dataset and a dynamic data collection approach for testing the abilities of Large Language Models (LLMs) in answering information retrieval queries with constraint filters. A filtering query with constraints can be of the form `"List all books written by Toni Morrison that were published between 1970-1980"`. The dataset was originally contributed by the paper ["KITAB: Evaluating LLMs on Constraint Satisfaction for Information Retrieval"](https://arxiv.org/abs/2310.15511) by Marah I Abdin, Suriya Gunasekar, Varun Chandrasekaran, Jerry Li, Mert Yuksekgonul, Rahee Ghosh Peshawaria, Ranjita Naik, and Besmira Nushi, 2023. The dataset is named after the word [kitab](https://en.wikipedia.org/wiki/Kitab), which is the word for "book" in Arabic, Swahili, Urdu, Hindi, and various Indian and Turkic languages.
KITAB consists of book-related data across more than 600 authors and 13,000 queries with a varying number of constraints and varying complexity. In each query, the first constraint is always fixed to an author, and the following constraints can vary among the types below to test for different constraint satisfaction capabilities:
- lexical (title starts or ends with a letter, word count in title)
- temporal (published between start and end year)
- named entity (city or human name present or not present in title)
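To make the constraint types concrete, here is a minimal, hypothetical sketch of how such filters could be checked against a book list. The function and field names are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch of KITAB-style book constraint checks.
# Field names ("title", "year") and constraint schema are assumptions.
def satisfies(book: dict, constraint: dict) -> bool:
    kind = constraint["type"]
    title = book["title"].lower()
    if kind == "starts-with":          # lexical
        return title.startswith(constraint["letter"])
    if kind == "ends-with":            # lexical
        return title.rstrip(".").endswith(constraint["letter"])
    if kind == "word-count":           # lexical
        return len(title.split()) == constraint["count"]
    if kind == "publishing-year":      # temporal
        return constraint["start"] <= book["year"] <= constraint["end"]
    raise ValueError(f"unknown constraint type: {kind}")

books = [
    {"title": "Beloved", "year": 1987},
    {"title": "Song of Solomon", "year": 1977},
]
published_70s = {"type": "publishing-year", "start": 1970, "end": 1980}
hits = [b["title"] for b in books if satisfies(b, published_70s)]
# hits == ["Song of Solomon"]
```

Named-entity constraints (city or human name in the title) would additionally require an entity recognizer, as described in the Responsible AI Considerations section below.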
## What is available in this repository?
This repository contains the following artifacts:
- All data for the KITAB sample used in the original paper. This consists of the set of authors, their corresponding books, and the set of queries with constraints.
- Example code for generating a new sample with a different set of authors. Here, the sampling and data collection steps do not include the generation of queries, as these may change according to the evaluation usage needs for the data. The example code also shows how to evaluate a potential model output (a list of books) against the ground truth provided in KITAB, following the same evaluation process as in the original paper. Note that this evaluation tends to relax some of the constraint satisfaction requirements, in particular when the model comes up with only a partial title.
- All prompts that were used in the original paper to evaluate GPT-4 and GPT-3.5.
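The relaxed matching of partial titles mentioned above could look roughly like the following sketch. This is an assumed illustration of the idea, not the repository's actual evaluation logic (see `evaluation.ipynb` for that):

```python
# Illustrative sketch of relaxed title matching: a model-produced title
# counts as a match if, after normalization, it is a substring of a
# ground-truth title (or vice versa). Not the repository's exact logic.
def normalize(title: str) -> str:
    return " ".join(title.lower().replace(":", " ").split())

def relaxed_match(model_title: str, ground_truth_titles: list[str]) -> bool:
    m = normalize(model_title)
    return any(
        m in normalize(t) or normalize(t) in m
        for t in ground_truth_titles
    )

relaxed_match("Beloved", ["Beloved: A Novel"])   # True (partial title)
relaxed_match("Sula", ["Song of Solomon"])       # False
```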
## Data
- [KITAB-ONE-BOOK-CONSTRAINTS.json](./data/KITAB-ONE-BOOK-CONSTRAINTS.json) and [KITAB-TWO-BOOK-CONSTRAINTS.json](./data/KITAB-TWO-BOOK-CONSTRAINTS.json) - correspond to queries with one and two book constraints. Each file has all the sufficient information that can be used to recreate a prompt query including the author, their birth year, number of sitelinks on WikiData, the constraint type(s), the constraint(s) expressed in natural language, the list of all books by the author, and the mapped list of books by the author that satisfy the constraint(s).
```
KITAB-ONE-BOOK-CONSTRAINTS_features = {
"Author": "author name",
"Birth Year": "author birth year",
"# of sitelinks": "number of external links related to the author",
"constraint_id": "unique id for the constraint",
"constraint_type": "type of the constraint",
"constraints": "the constraint",
"mapped_books": "list of books by the author mapped to the constraint",
"all_books": "full list of books by author post cleaning from openlibrary",
"raw_books": "raw list of books by author from openlibrary",
}
```
- [KITAB-author-metadata.json](./data/KITAB-author-metadata.json) - contains the set of 611 authors along with their birth year, the number of sitelinks in Wikidata, and their corresponding Open Library and WikiData identifiers.
- [KITAB-book-metadata.tar.gz](./data/KITAB-book-metadata.tar.gz) - contains a json file per author with all books retrieved from OpenLibrary for that author. The files contain the following information per title: the Open Library Id for the book, the Wikidata ID (if it exists), list of languages in which it was published, number of editions, number of words in the title, the earliest publishing year, city names found in the title (if any), a modified version of the title in lowercase that strips stop words like "A" and "The" from the title, and a set of other redundant versions of the same title as found in Open Library (if any).
## Code and evaluation scripts
Example notebooks included in this repository:
- [collect_authors_from_wikidata.py](./code/data_sampling/collect_authors_from_wikidata.py) and [wikidata_open_library_author_profiling.ipynb](./code/data_sampling/wikidata_open_library_author_profiling.ipynb) - example code for generating a new author sample from WikiData and OpenLibrary. Here, we also make available the longer list of authors that was originally sampled from WikiData to facilitate the sampling process although future work may also choose to repeat this step as needed. The full list can be found in: [wikidata_authors_crawl.csv](./code/data_sampling/wikidata_authors_crawl.csv).
- [fetch_book_data.py](./code/data_sampling/fetch_book_data.py) - example code for collecting book data for the set of authors sampled in the previous steps. Pulls data from OpenLibrary and WikiData to curate and clean the sample.
- [evaluation.ipynb](./code/evaluation.ipynb) - example code for evaluating model outputs from our [prompts](./prompts/) against ground truth KITAB data. Here, we also make available the GPT-4 output on human name detection, although as models improve future work may also choose to repeat this step as needed. Results can be found in: [gpt_4_name_data_processed.csv](./code/utils/gpt_4_name_data_processed.csv).
## Prompts
We use the following prompt templates for different experimental conditions on the KITAB data:
**ALL-BOOKS** \([Template 1](./prompts/Template_1.md)\): List all books from the author. This condition enables us to estimate an upper bound of model performance in retrieving relevant information for all queries, regardless of other constraints.
**NO-CONTEXT** \([Template 2a](./prompts/Template_2a.md)\): List all books from the author that also satisfy other book constraints.
**WITH-CONTEXT** \([Template 2b](./prompts/Template_2b.md)\): First, provide a full list of books from the author as input context to the model. Then, ask the model to list all books from the author that also satisfy other book constraints.
**SELF-CONTEXT** \([Template 3](./prompts/Template_3.md)\): Ask the model to first self-retrieve all books from the author, and then use that list to find those that also satisfy book constraints.
**NAME-CHECK** \([Template 4](./prompts/Template_4.md)\): Ask the model to find all books in a given list that contain a human name.
## Data Collection and Statistics
The author list was initially randomly sampled from [WikiData](https://www.wikidata.org/) and then filtered down to 611 authors to avoid potentially inaccurate data and extreme outliers. For example, this involved removing authors that have very few or too many books and authors that were born before 1850. The collected book data was derived from [Open Library](https://openlibrary.org/) and contains all books from the author that are tagged to be in English by Open Library or detected to be in English by the Language Detection service from the [Azure Cognitive Services API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/language-detection/overview). More details about author sampling and book data collection and cleaning are present in the paper.
Since there exists a large number of constraint instances depending on their cardinality, we subsample from the potential large set of queries in a way that ensures a balanced representation across constraint types, and a variety of constraints that have different constrainedness (i.e., defined as the complement of the ratio between the number of books that satisfy the constraints with the total number of all books from the author). The dataset also contains “unsatisfiable” constraints, which do not match any book titles in our data. This constitutes 7.99% of the queries with only one book constraint. The final dataset contains 8239 single-constraint queries and 4750 double-constraint queries. The table below shows how these queries are distributed across different constraint types. For all double-constraint queries, both constraints are individually satisfiable and generated by combining our single constraint data. Only 0.76% of the queries are jointly unsatisfiable across both constraints.
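The constrainedness measure defined above, κ = 1 - S/N (where S is the number of books satisfying the constraint and N the author's total book count), is straightforward to compute:

```python
# Constrainedness as defined above: kappa = 1 - S/N.
# Higher kappa means a more complex (more selective) constraint.
def constrainedness(num_satisfying: int, num_total: int) -> float:
    return 1 - num_satisfying / num_total

# An unsatisfiable constraint has maximal constrainedness:
print(constrainedness(0, 40))   # 1.0
# A constraint matched by 20 of an author's 40 books:
print(constrainedness(20, 40))  # 0.5
```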
<aside>
<center>
<style type="text/css">
.tg {border-collapse:collapse;border-color:#ccc;border-spacing:0;border-style:solid;border-width:1px;}
.tg td{background-color:#fff;border-color:#ccc;border-style:solid;border-width:0px;color:#333;
font-family:Arial, sans-serif;font-size:14px;overflow:hidden;padding:10px 5px;word-break:normal;}
.tg th{background-color:#50B49A;border-color:#ccc;border-style:solid;border-width:0px;color:#333;
font-family:Arial, sans-serif;font-size:14px;font-weight:normal;overflow:hidden;padding:10px 5px;word-break:normal;color:white}
.tg .tg-m5nv{border-color:#cccccc;text-align:center;vertical-align:top}
.tg .tg-x9uu{border-color:#cccccc;font-weight:bold;text-align:center;vertical-align:top}
.tg .tg-2bev{border-color:#cccccc;text-align:left;vertical-align:top}
.tg .tg-3cmc{border-color:#cccccc;text-align:right;vertical-align:top}
</style>
<table class="tg">
<caption>KITAB statistics on constraint frequency and average constrainedness. Two book constraint queries have more than one constraint type.
<br>
Constrainedness is defined as the complement of the ratio between the number of solutions S that satisfy the constraint and the total number of items in the domain N (higher constrainedness, more complex), i.e., κ = 1 - S/N.
</caption>
<thead>
<tr>
<th class="tg-m5nv"></th>
<th class="tg-x9uu" colspan="2">One book constraints</th>
<th class="tg-x9uu" colspan="2">Two book constraints</th>
</tr>
<tr>
<th class="tg-m5nv"><span style="font-weight:bold">Constraint Type</span></th>
<th class="tg-m5nv"><span style="font-weight:bold"># queries</span></td>
<th class="tg-x9uu"><span style="font-weight:bold">constrainedness</span></td>
<th class="tg-x9uu"><span style="font-weight:bold"># queries</span></td>
<th class="tg-x9uu"><span style="font-weight:bold">constrainedness</span></td>
</tr>
</thead>
<tbody>
<colgroup>
<col style="width: 120px">
<col style="width: 80px">
<col style="width: 100px">
<col style="width: 80px">
<col style="width: 100px">
</colgroup>
<tr>
<td class="tg-2bev">starts-with</td>
<td class="tg-3cmc">598</td>
<td class="tg-3cmc">0.90</td>
<td class="tg-3cmc">2163</td>
<td class="tg-3cmc">0.92</td>
</tr>
<tr>
<td class="tg-2bev">ends-with</td>
<td class="tg-3cmc">482</td>
<td class="tg-3cmc">0.89</td>
<td class="tg-3cmc">1782</td>
<td class="tg-3cmc">0.91</td>
</tr>
<tr>
<td class="tg-2bev">word-count</td>
<td class="tg-3cmc">1672</td>
<td class="tg-3cmc">0.53</td>
<td class="tg-3cmc">1630</td>
<td class="tg-3cmc">0.81</td>
</tr>
<tr>
<td class="tg-2bev">human-name</td>
<td class="tg-3cmc">611</td>
<td class="tg-3cmc">0.77</td>
<td class="tg-3cmc">292</td>
<td class="tg-3cmc">0.89</td>
</tr>
<tr>
<td class="tg-2bev">no-human-name</td>
<td class="tg-3cmc">611</td>
<td class="tg-3cmc">0.23</td>
<td class="tg-3cmc">801</td>
<td class="tg-3cmc">0.78</td>
</tr>
<tr>
<td class="tg-2bev">city-name</td>
<td class="tg-3cmc">611</td>
<td class="tg-3cmc">0.92</td>
<td class="tg-3cmc">197</td>
<td class="tg-3cmc">0.81</td>
</tr>
<tr>
<td class="tg-2bev">no-city-name</td>
<td class="tg-3cmc">611</td>
<td class="tg-3cmc">0.08</td>
<td class="tg-3cmc">831</td>
<td class="tg-3cmc">0.77</td>
</tr>
<tr>
<td class="tg-2bev">publishing-year</td>
<td class="tg-3cmc">3043</td>
<td class="tg-3cmc">0.80</td>
<td class="tg-3cmc">1804</td>
<td class="tg-3cmc">0.89</td>
</tr>
<tr>
<td class="tg-2bev">Summary</td>
<td class="tg-3cmc">8239</td>
<td class="tg-3cmc">0.67</td>
<td class="tg-3cmc">4750</td>
<td class="tg-3cmc">0.87</td>
</tr>
</tbody>
</table>
</center>
<br><br>
</aside>
<figure><center>
<img src="figures/popularity_wide.png" width="1000">
<figcaption>Distribution of KITAB queries across author popularity as measured by the number of sitelinks on Wikidata,
for queries with a single book constraint (left) and two book constraints (right).</figcaption>
</center>
</figure>
<figure><center>
<img src="figures/constrainedness_wide.png" width="1000">
<figcaption>Distribution of queries across author constrainedness as measured by the complement of the ratio
between the number of books that satisfy the book constraints and the total number of books from the author.
Distribution is shown for queries with a single book constraint (left) and two book constraints (right). Note
that most of the distribution in the lower range of constrainedness is dominated by constraints that require no
human name or no city name in the title, which are naturally easier to satisfy.</figcaption></center>
</figure>
## Responsible AI Considerations
*Data Cleaning*: Despite our best efforts to collect a complete and accurate set of books, we faced a variety of challenges in retrieval and cleaning, which we describe further in Appendix C.1 of the paper. To estimate the extent to which potential data cleaning issues may impact the data quality of KITAB and downstream evaluation, we also undertook a manual data annotation exercise, searching the web for titles provided by GPT-4 and GPT-3.5 that were marked as not from the author in our dataset. In summary, based on a manual annotation of a subsample of queries, we find that less than 5% of the queries to GPT-4 and less than 6% of the queries to GPT-3.5 may be affected by cases where the model finds a book title that is not in KITAB and that will consequently be marked as not from the author during our evaluation. While this can be remediated by using further data sources, the impact of the missing information on model comparison is minor.
*Human Names*: Entity recognition for human names was done using both the [Azure Cognitive Services API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/language-detection/overview) and GPT-4 (Template 4 in Appendix D of the paper), as we found the two approaches to be complementary for detecting names from different cultures. Note that even after using both of these resources, there may still be names that are not recognized by either API, which shows that more work is required to improve the quality of entity recognition services and their fairness across different languages and cultures.
*City Names*: For city names, we use [Azure Cognitive Services API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/overview) along with [Geonames](https://public.opendatasoft.com/explore/dataset/geonames-all-cities-with-a-population-1000), a database of cities with more than 1000 inhabitants.
*Author representation*: The list of authors in KITAB was sampled randomly from a large set of authors present in Open Library. We see that the rate of irrelevant information generated by current models increases as the number of sitelinks in Wikidata decreases. Since the number of sitelinks may also correlate with the author's age (birth year) or even their nationality and how well their community is linked to the World Wide Web, this observation has important implications for model quality of service across geographical regions, author popularity, and author age. While KITAB naturally contains more authors with a lower number of sitelinks (as indicated by its long-tail distribution of author count vs. popularity), future fairness measurement investigations may also need to oversample explicitly from cohorts belonging to given demographic and geographical attributes.
## State-of-the-art results on KITAB
<aside>
<center>
<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
overflow:hidden;padding:10px 5px;word-break:normal;}
.tg th{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
font-weight:normal;overflow:hidden;padding:10px 5px;word-break:normal;}
.tg .tg-qwh1{border-color:#cccccc;font-weight:bold;text-align:left;vertical-align:top}
.tg .tg-omta{background-color:#50b49a;border-color:#cccccc;color:#ffffff;text-align:left;vertical-align:top}
.tg .tg-h4uz{background-color:#50b49a;border-color:#cccccc;color:#ffffff;font-weight:bold;text-align:center;vertical-align:top}
.tg .tg-tr5t{border-color:#cccccc;text-align:right;vertical-align:top}
</style>
<table class="tg" style="undefined;table-layout: fixed; width: 675px">
<colgroup>
<col style="width: 87.130435px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
<col style="width: 42px">
</colgroup>
<thead>
<tr>
<th class="tg-omta" rowspan="2"></th>
<th class="tg-h4uz" colspan="3" rowspan="2">Irrelevant Information ↓</th>
<th class="tg-h4uz" colspan="6">Relevant Information<br>(Books from the author)</th>
<th class="tg-h4uz" colspan="3" rowspan="2">Completeness ↑ </th>
<th class="tg-h4uz" colspan="3" rowspan="2">All Correct ↑ </th>
</tr>
<tr>
<th class="tg-h4uz" colspan="3">Satisfied ↑ </th>
<th class="tg-h4uz" colspan="3">Unsatisfied ↓</th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-qwh1">GPT-4</td>
<td class="tg-tr5t">0.26</td>
<td class="tg-tr5t">0.33</td>
<td class="tg-tr5t">0.00</td>
<td class="tg-tr5t">0.51</td>
<td class="tg-tr5t">0.49</td>
<td class="tg-tr5t">0.78</td>
<td class="tg-tr5t">0.24</td>
<td class="tg-tr5t">0.19</td>
<td class="tg-tr5t">0.21</td>
<td class="tg-tr5t">0.24</td>
<td class="tg-tr5t">0.26</td>
<td class="tg-tr5t">0.70</td>
<td class="tg-tr5t">0.08</td>
<td class="tg-tr5t">0.08</td>
<td class="tg-tr5t">0.31</td>
</tr>
<tr>
<td class="tg-qwh1">GPT-3.5</td>
<td class="tg-tr5t">0.20</td>
<td class="tg-tr5t">0.44</td>
<td class="tg-tr5t">0.00</td>
<td class="tg-tr5t">0.44</td>
<td class="tg-tr5t">0.26</td>
<td class="tg-tr5t">0.68</td>
<td class="tg-tr5t">0.36</td>
<td class="tg-tr5t">0.30</td>
<td class="tg-tr5t">0.32</td>
<td class="tg-tr5t">0.16</td>
<td class="tg-tr5t">0.16</td>
<td class="tg-tr5t">0.47</td>
<td class="tg-tr5t">0.07</td>
<td class="tg-tr5t">0.02</td>
<td class="tg-tr5t">0.15</td>
</tr>
</tbody>
<caption>Aggregated model performance on KITAB for three experimental conditions <br>
NO-CONTEXT | SELF-CONTEXT | WITH-CONTEXT (see definitions in the prompts section) <br> for queries requesting a list of books from a given author satisfying one additional book constraint. Both models have high rates of irrelevant information and poor constraint satisfaction across the board. Context availability mitigates irrelevant information rate, but constraint satisfaction still remains low. Full correctness (i.e., perfect match of the post-processed model output and the ground truth) is strikingly low across all conditions and models but there is visible improvement for WITH-CONTEXT.</caption>
</table>
</center>
</aside>
## How to cite
<pre>
@article{abdin2023kitab,
title={KITAB: Evaluating LLMs on Constraint Satisfaction for Information Retrieval},
author={Abdin, Marah I and Gunasekar, Suriya and Chandrasekaran, Varun and Li, Jerry and Yuksekgonul, Mert and Peshawaria, Rahee Ghosh and Naik, Ranjita and Nushi, Besmira},
journal={arXiv preprint arXiv:2310.15511},
year={2023}
}
</pre>
## Contributors
[Marah I Abdin](https://www.linkedin.com/in/marah-abdin/), [Suriya Gunasekar](https://sgunasekar.github.io/), [Varun Chandrasekaran](https://ece.illinois.edu/about/directory/faculty/varunc), [Jerry Li](https://jerryzli.github.io/), [Mert Yuksekgonul](https://mertyg.github.io/), [Rahee Ghosh Peshawaria](https://www.linkedin.com/in/rahee-ghosh-peshawaria/), [Ranjita Naik](https://github.com/ranjita-naik), [Besmira Nushi](https://besmiranushi.com/) |
joaosanches/literatura_brasileira | ---
dataset_info:
features:
- name: O_Alienista
dtype: string
- name: Cinco_Minutos
dtype: string
- name: O_Mulato
dtype: string
- name: Clara_dos_Anjos
dtype: string
splits:
- name: train
num_bytes: 1129086
num_examples: 1
download_size: 698786
dataset_size: 1129086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
stoddur/med_chat_new_med_list | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 766347120.0
num_examples: 248814
download_size: 11052925
dataset_size: 766347120.0
---
# Dataset Card for "med_chat_new_med_list"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/mtop_domain | ---
task_categories:
- text-classification
language:
- de
- en
- es
- fr
- hi
- th
--- |
khaled123/MathReasoning | ---
task_categories:
- table-question-answering
tags:
- code
size_categories:
- 1K<n<10K
--- |
JJJJKKKKK/MMMMMM | ---
license: afl-3.0
---
|
presencesw/cot-collection_v1 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 2018671136.9203568
num_examples: 1795948
download_size: 1237044847
dataset_size: 2018671136.9203568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_160m_thr_0.1_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43761514
num_examples: 18928
- name: epoch_1
num_bytes: 44353908
num_examples: 18928
- name: epoch_2
num_bytes: 44404430
num_examples: 18928
- name: epoch_3
num_bytes: 44439675
num_examples: 18928
- name: epoch_4
num_bytes: 44440654
num_examples: 18928
- name: epoch_5
num_bytes: 44427417
num_examples: 18928
- name: epoch_6
num_bytes: 44412442
num_examples: 18928
- name: epoch_7
num_bytes: 44403813
num_examples: 18928
- name: epoch_8
num_bytes: 44399234
num_examples: 18928
- name: epoch_9
num_bytes: 44393284
num_examples: 18928
- name: epoch_10
num_bytes: 44392194
num_examples: 18928
- name: epoch_11
num_bytes: 44392501
num_examples: 18928
- name: epoch_12
num_bytes: 44390723
num_examples: 18928
- name: epoch_13
num_bytes: 44388727
num_examples: 18928
- name: epoch_14
num_bytes: 44389016
num_examples: 18928
- name: epoch_15
num_bytes: 44390645
num_examples: 18928
- name: epoch_16
num_bytes: 44387804
num_examples: 18928
- name: epoch_17
num_bytes: 44389529
num_examples: 18928
- name: epoch_18
num_bytes: 44388306
num_examples: 18928
- name: epoch_19
num_bytes: 44388996
num_examples: 18928
- name: epoch_20
num_bytes: 44389089
num_examples: 18928
- name: epoch_21
num_bytes: 44389028
num_examples: 18928
- name: epoch_22
num_bytes: 44389179
num_examples: 18928
- name: epoch_23
num_bytes: 44389162
num_examples: 18928
- name: epoch_24
num_bytes: 44389152
num_examples: 18928
- name: epoch_25
num_bytes: 44388518
num_examples: 18928
- name: epoch_26
num_bytes: 44388843
num_examples: 18928
- name: epoch_27
num_bytes: 44389070
num_examples: 18928
- name: epoch_28
num_bytes: 44388794
num_examples: 18928
- name: epoch_29
num_bytes: 44389924
num_examples: 18928
download_size: 701264739
dataset_size: 1331225571
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
Waster3012/nlp-fever-results | ---
license: unknown
dataset_info:
features:
- name: claim
dtype: string
- name: label
dtype: string
- name: page_extraction
sequence: string
- name: ranked_sentences
sequence: string
splits:
- name: paper_test
num_bytes: 29468724
num_examples: 9876
download_size: 12363062
dataset_size: 29468724
---
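A minimal usage sketch for the schema above, assuming the standard 🤗 `datasets` API and the single `paper_test` split listed in the metadata. The `load_dataset` call is shown commented out because it downloads data; the sample record below is illustrative, not taken from the dataset.

```python
# Requires: pip install datasets
# from datasets import load_dataset
# ds = load_dataset('Waster3012/nlp-fever-results', split='paper_test')

def label_counts(records):
    """Tally the `label` field across records following the card's schema."""
    counts = {}
    for rec in records:
        counts[rec['label']] = counts.get(rec['label'], 0) + 1
    return counts

# Illustrative record matching the features declared above:
sample = [{
    'claim': 'An example claim.',
    'label': 'SUPPORTS',
    'page_extraction': ['Example_Page'],
    'ranked_sentences': ['An example evidence sentence.'],
}]
print(label_counts(sample))  # {'SUPPORTS': 1}
```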
|
vwxyzjn/openhermes-dev__kaist-ai_prometheus-13b-v1.0__1707408224 | ---
dataset_info:
features:
- name: model
dtype: 'null'
- name: category
dtype: string
- name: language
dtype: string
- name: custom_instruction
dtype: bool
- name: id
dtype: string
- name: topic
dtype: string
- name: avatarUrl
dtype: 'null'
- name: idx
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: string
- name: source
dtype: string
- name: model_name
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: string
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: prompt
dtype: string
- name: token_length
dtype: int64
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1_policy
dtype: string
- name: llm_as_a_judge_prompt
dtype: string
- name: completion0
dtype: string
- name: candidate0_score
dtype: float64
- name: completion1
dtype: string
- name: candidate1_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 3261799
num_examples: 167
download_size: 1763982
dataset_size: 3261799
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
---
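The `chosen`/`rejected` columns above hold chat-format message lists. Below is a minimal sketch of flattening one record into a plain-text preference pair (the split name `train_prefs` comes from the metadata above; the example record is illustrative, not taken from the data, and the `load_dataset` call is commented out because it downloads data).

```python
# Requires: pip install datasets
# from datasets import load_dataset
# ds = load_dataset('vwxyzjn/openhermes-dev__kaist-ai_prometheus-13b-v1.0__1707408224',
#                   split='train_prefs')

def to_text_pair(example):
    """Flatten chat-format chosen/rejected message lists into plain strings."""
    render = lambda msgs: '\n'.join(f"{m['role']}: {m['content']}" for m in msgs)
    return {
        'prompt': example['prompt'],
        'chosen': render(example['chosen']),
        'rejected': render(example['rejected']),
    }

# Illustrative record with the list-of-{content, role} layout declared above:
example = {
    'prompt': 'What is 2 + 2?',
    'chosen': [{'role': 'user', 'content': 'What is 2 + 2?'},
               {'role': 'assistant', 'content': '4'}],
    'rejected': [{'role': 'user', 'content': 'What is 2 + 2?'},
                 {'role': 'assistant', 'content': '5'}],
}
pair = to_text_pair(example)
print(pair['chosen'])
```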
|
CyberHarem/mima_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mima/魅魔 (Touhou)
This is the dataset of mima/魅魔 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, long_hair, hat, green_eyes, wizard_hat, bow, ribbon, breasts, ghost_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 414.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mima_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 300.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mima_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 892 | 525.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mima_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 389.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mima_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 892 | 646.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mima_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mima_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, blue_capelet, blue_headwear, blue_skirt, holding_staff, long_sleeves, looking_at_viewer, solo, frills, smile, white_shirt, yellow_bowtie, blue_vest, crescent, closed_mouth, parted_bangs, very_long_hair, white_ribbon |
| 1 | 19 |  |  |  |  |  | 1girl, solo, staff, capelet, crescent, dress, smile, star_(symbol) |
| 2 | 6 |  |  |  |  |  | 1girl, blue_capelet, smile, solo, staff |
| 3 | 6 |  |  |  |  |  | 1girl, blue_capelet, holding_staff, long_sleeves, looking_at_viewer, solo, blue_dress, bowtie, parted_bangs, smile, star_(symbol), blue_headwear, crescent_print, closed_mouth, demon_wings, frilled_dress, purple_cape, simple_background, very_long_hair, white_ribbon |
| 4 | 10 |  |  |  |  |  | 1girl, blue_capelet, large_breasts, hair_intakes, hair_ribbon, solo, underboob, no_headwear, white_ribbon, blue_skirt, looking_at_viewer, closed_mouth, v-shaped_eyebrows, smile, upper_body |
| 5 | 6 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, solo, white_shirt, red_neckerchief, capelet, holding_knife, looking_at_viewer, blood_on_knife, short_sleeves, white_headwear |
| 6 | 5 |  |  |  |  |  | large_breasts, nipples, nude, 1girl, censored, convenient_censoring, solo, ass, breast_hold, multiple_girls, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_capelet | blue_headwear | blue_skirt | holding_staff | long_sleeves | looking_at_viewer | solo | frills | smile | white_shirt | yellow_bowtie | blue_vest | crescent | closed_mouth | parted_bangs | very_long_hair | white_ribbon | staff | capelet | dress | star_(symbol) | blue_dress | bowtie | crescent_print | demon_wings | frilled_dress | purple_cape | simple_background | large_breasts | hair_intakes | hair_ribbon | underboob | no_headwear | v-shaped_eyebrows | upper_body | blue_sailor_collar | red_neckerchief | holding_knife | blood_on_knife | short_sleeves | white_headwear | nipples | nude | censored | convenient_censoring | ass | breast_hold | multiple_girls | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:-------------|:----------------|:---------------|:--------------------|:-------|:---------|:--------|:--------------|:----------------|:------------|:-----------|:---------------|:---------------|:-----------------|:---------------|:--------|:----------|:--------|:----------------|:-------------|:---------|:-----------------|:--------------|:----------------|:--------------|:--------------------|:----------------|:---------------|:--------------|:------------|:--------------|:--------------------|:-------------|:---------------------|:------------------|:----------------|:-----------------|:----------------|:-----------------|:----------|:-------|:-----------|:-----------------------|:------|:--------------|:-----------------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | | | | | | | X | | X | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | | | | X | X | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | X | | | X | X | | X | | | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | X | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-futin__guess-vi_3-3e6f1a-2087867179 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloomz-560m
metrics: []
dataset_name: futin/guess
dataset_config: vi_3
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloomz-560m
* Dataset: futin/guess
* Config: vi_3
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst | ---
pretty_name: Evaluation run of tyson0420/stack_codellama-7b-inst
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tyson0420/stack_codellama-7b-inst](https://huggingface.co/tyson0420/stack_codellama-7b-inst)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T13:18:15.759578](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst/blob/main/results_2024-02-14T13-18-15.759578.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39779725857115605,\n\
\ \"acc_stderr\": 0.034197794443939396,\n \"acc_norm\": 0.4011078418565935,\n\
\ \"acc_norm_stderr\": 0.03495973121827962,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752323,\n \"mc2\": 0.39026734427166393,\n\
\ \"mc2_stderr\": 0.01459951299615118\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39078498293515357,\n \"acc_stderr\": 0.01425856388051378,\n\
\ \"acc_norm\": 0.4351535836177474,\n \"acc_norm_stderr\": 0.014487986197186041\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49123680541724757,\n\
\ \"acc_stderr\": 0.004989014986235631,\n \"acc_norm\": 0.6617207727544314,\n\
\ \"acc_norm_stderr\": 0.0047215714433544095\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n \
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.41935483870967744,\n\
\ \"acc_stderr\": 0.02807158890109185,\n \"acc_norm\": 0.41935483870967744,\n\
\ \"acc_norm_stderr\": 0.02807158890109185\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937533,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937533\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431857,\n\
\ \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431857\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n\
\ \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.02385479568097114,\n\
\ \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.02385479568097114\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5247706422018349,\n \"acc_stderr\": 0.021410999753635914,\n \"\
acc_norm\": 0.5247706422018349,\n \"acc_norm_stderr\": 0.021410999753635914\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.031546962856566295,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.031546962856566295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47549019607843135,\n \"acc_stderr\": 0.035050931943487976,\n \"\
acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.035050931943487976\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n\
\ \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n\
\ \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n\
\ \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.03035152732334494,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.03035152732334494\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5261813537675607,\n\
\ \"acc_stderr\": 0.01785543455404199,\n \"acc_norm\": 0.5261813537675607,\n\
\ \"acc_norm_stderr\": 0.01785543455404199\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.026483392042098187,\n\
\ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.026483392042098187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.01461446582196634,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.01461446582196634\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.02824513402438729,\n\
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.02824513402438729\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n\
\ \"acc_stderr\": 0.028333277109562786,\n \"acc_norm\": 0.4662379421221865,\n\
\ \"acc_norm_stderr\": 0.028333277109562786\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.0276847214156562,\n\
\ \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.0276847214156562\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534785,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534785\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32529335071707954,\n\
\ \"acc_stderr\": 0.011965311536571528,\n \"acc_norm\": 0.32529335071707954,\n\
\ \"acc_norm_stderr\": 0.011965311536571528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.37745098039215685,\n \"acc_stderr\": 0.019610851474880286,\n \"\
acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.019610851474880286\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.44776119402985076,\n\
\ \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.44776119402985076,\n\
\ \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n\
\ \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.038342347441649924,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.038342347441649924\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752323,\n\
\ \"mc2\": 0.39026734427166393,\n \"mc2_stderr\": 0.01459951299615118\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6566692975532754,\n\
\ \"acc_stderr\": 0.013344823185357998\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.15845337376800606,\n \"acc_stderr\": 0.010058474790238966\n\
\ }\n}\n```"
repo_url: https://huggingface.co/tyson0420/stack_codellama-7b-inst
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|arc:challenge|25_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|gsm8k|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hellaswag|10_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T13-18-15.759578.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- '**/details_harness|winogrande|5_2024-02-14T13-18-15.759578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T13-18-15.759578.parquet'
- config_name: results
data_files:
- split: 2024_02_14T13_18_15.759578
path:
- results_2024-02-14T13-18-15.759578.parquet
- split: latest
path:
- results_2024-02-14T13-18-15.759578.parquet
---
# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/stack_codellama-7b-inst](https://huggingface.co/tyson0420/stack_codellama-7b-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
"harness_winogrande_5",
	split="latest")
```
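The timestamped splits follow a simple naming rule: the run timestamp (e.g. `2024-02-14T13:18:15.759578`) becomes a split name with `-` and `:` replaced by `_` (e.g. `2024_02_14T13_18_15.759578`). A small helper sketching this mapping (illustrative only, not part of the `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2024-02-14T13:18:15.759578" -> "2024_02_14T13_18_15.759578"."""
    return ts.replace("-", "_").replace(":", "_")


# Example: load a specific run instead of the "latest" alias.
# from datasets import load_dataset
# split = run_timestamp_to_split("2024-02-14T13:18:15.759578")
# data = load_dataset(
#     "open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
#     "harness_winogrande_5",
#     split=split,
# )
```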
## Latest results
These are the [latest results from run 2024-02-14T13:18:15.759578](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst/blob/main/results_2024-02-14T13-18-15.759578.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.39779725857115605,
"acc_stderr": 0.034197794443939396,
"acc_norm": 0.4011078418565935,
"acc_norm_stderr": 0.03495973121827962,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.39026734427166393,
"mc2_stderr": 0.01459951299615118
},
"harness|arc:challenge|25": {
"acc": 0.39078498293515357,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.4351535836177474,
"acc_norm_stderr": 0.014487986197186041
},
"harness|hellaswag|10": {
"acc": 0.49123680541724757,
"acc_stderr": 0.004989014986235631,
"acc_norm": 0.6617207727544314,
"acc_norm_stderr": 0.0047215714433544095
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.41935483870967744,
"acc_stderr": 0.02807158890109185,
"acc_norm": 0.41935483870967744,
"acc_norm_stderr": 0.02807158890109185
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937533,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937533
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.46632124352331605,
"acc_stderr": 0.03600244069867178,
"acc_norm": 0.46632124352331605,
"acc_norm_stderr": 0.03600244069867178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.02385479568097114,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.02385479568097114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5247706422018349,
"acc_stderr": 0.021410999753635914,
"acc_norm": 0.5247706422018349,
"acc_norm_stderr": 0.021410999753635914
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.031546962856566295,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.031546962856566295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.035050931943487976,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.035050931943487976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.38650306748466257,
"acc_stderr": 0.038258255488486076,
"acc_norm": 0.38650306748466257,
"acc_norm_stderr": 0.038258255488486076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.03035152732334494,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.03035152732334494
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5261813537675607,
"acc_stderr": 0.01785543455404199,
"acc_norm": 0.5261813537675607,
"acc_norm_stderr": 0.01785543455404199
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.026483392042098187,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.026483392042098187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.01461446582196634,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.01461446582196634
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562786,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562786
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4506172839506173,
"acc_stderr": 0.0276847214156562,
"acc_norm": 0.4506172839506173,
"acc_norm_stderr": 0.0276847214156562
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534785,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32529335071707954,
"acc_stderr": 0.011965311536571528,
"acc_norm": 0.32529335071707954,
"acc_norm_stderr": 0.011965311536571528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.44776119402985076,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.44776119402985076,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.038342347441649924,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.038342347441649924
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.39026734427166393,
"mc2_stderr": 0.01459951299615118
},
"harness|winogrande|5": {
"acc": 0.6566692975532754,
"acc_stderr": 0.013344823185357998
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238966
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Deepakvictor/tan-tam | ---
license: openrail
task_categories:
- translation
- text-classification
language:
- ta
- en
pretty_name: translation
size_categories:
- 1K<n<10K
---
Translation of Tanglish (romanized Tamil) to Tamil.

Source: karky.in

To use:
```python
import datasets
s = datasets.load_dataset('Deepakvictor/tan-tam')
print(s)
"""
DatasetDict({
train: Dataset({
features: ['en', 'ta'],
num_rows: 22114
})
})
"""
```
Credits and Source: https://karky.in/
---
For the complex version, see "Deepakvictor/tanglish-tamil".
bhuwanupadhyay/wikisql-reduced-data | ---
license: apache-2.0
---
|
danioshi/incubus_taylor_swift_lyrics | ---
license: cc0-1.0
language:
- en
tags:
- music
pretty_name: Incubus and Taylor Swift lyrics
size_categories:
- n<1K
---
# Description
This dataset contains lyrics from both Incubus and Taylor Swift.
# Format
The file is in CSV format and contains three columns: Artist, Song Name and Lyrics.
## Caveats
The Song Name column has been normalized to a single lowercase string with spaces removed, so instead of "Name of Song" the value will be "nameofsong".
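The transform described above can be sketched as a small helper (this reproduces only what the card states — lowercasing and space removal; any additional preprocessing the author applied, e.g. punctuation stripping, is unknown):

```python
def normalize_song_name(name: str) -> str:
    """Normalize a song title the way the dataset does: drop spaces, lowercase."""
    return name.replace(" ", "").lower()

# Example matching the card's description
print(normalize_song_name("Name of Song"))  # nameofsong
```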
Duxiaoman-DI/FinanceIQ | ---
license: cc-by-nc-sa-4.0
---
|
lmms-lab/VideoChatGPT | ---
license: mit
dataset_info:
- config_name: Consistency
features:
- name: video_name
dtype: string
- name: question_1
dtype: string
- name: question_2
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 222330
num_examples: 998
download_size: 73987
dataset_size: 222330
- config_name: Generic
features:
- name: video_name
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 747724
num_examples: 1996
download_size: 358043
dataset_size: 747724
- config_name: Temporal
features:
- name: video_name
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 215040
num_examples: 499
download_size: 114482
dataset_size: 215040
configs:
- config_name: Consistency
data_files:
- split: test
path: Consistency/test-*
- config_name: Generic
data_files:
- split: test
path: Generic/test-*
- config_name: Temporal
data_files:
- split: test
path: Temporal/test-*
---
|
ideepankarsharma2003/AIGeneratedImages_Midjourney | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ai_gen
'1': human
splits:
- name: train
num_bytes: 20242565282.0
num_examples: 18000
- name: validation
num_bytes: 21688393589.775
num_examples: 20715
- name: test
num_bytes: 14590974798.406
num_examples: 13354
download_size: 30126501705
dataset_size: 56521933670.181
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# AI Generated Image for Image Classification
<!-- Provide a quick summary of the dataset. -->
This dataset contains AI-generated images produced by Midjourney and human-created images taken from ImageNet. The dataset is intended for image classification tasks.
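The frontmatter above defines the label schema (`0 → ai_gen`, `1 → human`). A minimal sketch of decoding integer class labels back to names, with the mapping taken directly from the card's metadata:

```python
# Label schema taken from the dataset card's frontmatter.
ID2LABEL = {0: "ai_gen", 1: "human"}
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

def decode_label(label_id: int) -> str:
    """Map an integer class label to its human-readable name."""
    return ID2LABEL[label_id]

print(decode_label(0))  # ai_gen
```

When loading with 🤗 Datasets, the same mapping is also available via the dataset's `features["label"].names`.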
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Deepankar Sharma
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a256 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r8_a256
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r8_a256](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a256)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a256\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:23:39.833062](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a256/blob/main/results_2024-02-10T01-23-39.833062.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5324938577915043,\n\
\ \"acc_stderr\": 0.03380835687493485,\n \"acc_norm\": 0.5381763134612871,\n\
\ \"acc_norm_stderr\": 0.0345444080771251,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.38036691779076676,\n\
\ \"mc2_stderr\": 0.013738800535587169\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137994,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n\
\ \"acc_stderr\": 0.004860162076330988,\n \"acc_norm\": 0.8178649671380203,\n\
\ \"acc_norm_stderr\": 0.0038516699346338897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n\
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7339449541284404,\n \"acc_stderr\": 0.01894602232222559,\n \"\
acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.01894602232222559\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138605,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138605\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470022,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470022\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7292464878671775,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.7292464878671775,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387296,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630988,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630988\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\
\ \"acc_stderr\": 0.012520315120147108,\n \"acc_norm\": 0.4015645371577575,\n\
\ \"acc_norm_stderr\": 0.012520315120147108\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795198,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.037998574544796375,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.037998574544796375\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.38036691779076676,\n\
\ \"mc2_stderr\": 0.013738800535587169\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843903\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20166793025018953,\n \
\ \"acc_stderr\": 0.011052295889544381\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a256
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-23-39.833062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-23-39.833062.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- '**/details_harness|winogrande|5_2024-02-10T01-23-39.833062.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-23-39.833062.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_23_39.833062
path:
- results_2024-02-10T01-23-39.833062.parquet
- split: latest
path:
- results_2024-02-10T01-23-39.833062.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r8_a256
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r8_a256](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a256) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a256",
"harness_winogrande_5",
	split="latest")
```
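Each per-task configuration loaded this way yields rows keyed by metric names such as `acc` and `acc_norm`. As a minimal sketch of working with the aggregated scores (using a small hypothetical in-memory dict in place of a downloaded run, so no Hub access is needed):

```python
# Sketch: averaging per-task accuracies from a results-style nested dict.
# The two entries below are hypothetical stand-ins mirroring the structure
# of the aggregated results shown in the next section.
results = {
    "harness|arc:challenge|25": {"acc": 0.5700, "acc_norm": 0.5981},
    "harness|hellaswag|10": {"acc": 0.6132, "acc_norm": 0.8179},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.5916
```

The real aggregated dict (see "Latest results" below) follows the same task-name → metrics layout, so the same comprehension applies to a loaded run.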
## Latest results
These are the [latest results from run 2024-02-10T01:23:39.833062](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a256/blob/main/results_2024-02-10T01-23-39.833062.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5324938577915043,
"acc_stderr": 0.03380835687493485,
"acc_norm": 0.5381763134612871,
"acc_norm_stderr": 0.0345444080771251,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.38036691779076676,
"mc2_stderr": 0.013738800535587169
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137994,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578278
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.004860162076330988,
"acc_norm": 0.8178649671380203,
"acc_norm_stderr": 0.0038516699346338897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.01894602232222559,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.01894602232222559
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470022,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470022
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7292464878671775,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.7292464878671775,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387296,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630988,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630988
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.012520315120147108,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.012520315120147108
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795198,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.037998574544796375,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.037998574544796375
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.38036691779076676,
"mc2_stderr": 0.013738800535587169
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843903
},
"harness|gsm8k|5": {
"acc": 0.20166793025018953,
"acc_stderr": 0.011052295889544381
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tnwei/ms-newspapers | ---
language:
- ms
---
# ms-newspapers
Scraped online Malaysian newspapers in Bahasa Malaysia (Malay).
Kosmo and Utusan Malaysia were both scraped in mid-May 2023.
## Dataset Structure
JSONL format, snippets below:
```bash
# Utusan Malaysia
$ tail -n 1 utusan-20230512.jsonl
{"index": 168799, "text": "...", "access_date": "2023-05-15 00:20:04.418003"}
# Kosmo
$ tail -n 1 kosmo-20230524.jsonl
{"index": 51699, "url": "...", "text": "...", "access_date": "2023-05-25 01:14:16.540146"}
``` |
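Each file can be consumed line by line with the standard library; a minimal sketch (the field names follow the snippets above, and the file path is only an example):

```python
import json

def read_jsonl(path):
    """Yield one record (dict) per non-empty line of a JSONL file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# e.g. collect the article bodies from the Utusan Malaysia dump:
# texts = [record["text"] for record in read_jsonl("utusan-20230512.jsonl")]
```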
saritha123/sari-567 | ---
license: openrail
---
|
datasciathlete/aihub-korean | ---
dataset_info:
features:
- name: entities
list:
- name: entity_mentions
sequence: string
- name: entity_type
dtype: string
- name: spans
sequence:
sequence: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 33092895.2
num_examples: 82380
- name: validation
num_bytes: 8273223.8
num_examples: 20595
download_size: 19445128
dataset_size: 41366119.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_FelixChao__WestSeverus-7B-DPO-v2 | ---
pretty_name: Evaluation run of FelixChao/WestSeverus-7B-DPO-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WestSeverus-7B-DPO-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T13:24:18.891595](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B-DPO-v2/blob/main/results_2024-01-24T13-24-18.891595.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535299062128674,\n\
\ \"acc_stderr\": 0.032051357947894016,\n \"acc_norm\": 0.6530323423754857,\n\
\ \"acc_norm_stderr\": 0.03272062624963206,\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.7236632495570844,\n\
\ \"mc2_stderr\": 0.014431747881822006\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.01353247209985094,\n\
\ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6955785700059749,\n\
\ \"acc_stderr\": 0.004592215118295279,\n \"acc_norm\": 0.882692690699064,\n\
\ \"acc_norm_stderr\": 0.003211284760701656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493875,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.0235407993587233,\n \"acc_norm\"\
: 0.7806451612903226,\n \"acc_norm_stderr\": 0.0235407993587233\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n\
\ \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n\
\ \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n\
\ \"acc_stderr\": 0.01641444091729315,\n \"acc_norm\": 0.4044692737430168,\n\
\ \"acc_norm_stderr\": 0.01641444091729315\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.7236632495570844,\n\
\ \"mc2_stderr\": 0.014431747881822006\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7164518574677786,\n \
\ \"acc_stderr\": 0.012415070917508118\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|arc:challenge|25_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|gsm8k|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hellaswag|10_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-24-18.891595.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T13-24-18.891595.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- '**/details_harness|winogrande|5_2024-01-24T13-24-18.891595.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T13-24-18.891595.parquet'
- config_name: results
data_files:
- split: 2024_01_24T13_24_18.891595
path:
- results_2024-01-24T13-24-18.891595.parquet
- split: latest
path:
- results_2024-01-24T13-24-18.891595.parquet
---
# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B-DPO-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WestSeverus-7B-DPO-v2",
"harness_winogrande_5",
split="train")
```
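To post-process the loaded details, you can filter on the task-name prefix, for instance to macro-average accuracy over the MMLU (hendrycksTest) subtasks. A minimal sketch; the helper function and the hardcoded sample dict are illustrative only (the values are copied from the "Latest results" section below), not part of the dataset API:

```python
def mmlu_macro_average(results: dict) -> float:
    """Average the 'acc' metric over all harness|hendrycksTest-* entries."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    if not accs:
        raise ValueError("no MMLU subtask results found")
    return sum(accs) / len(accs)


# Hypothetical sample mirroring the shape of the results JSON below;
# only the two hendrycksTest entries contribute to the average.
sample = {
    "harness|arc:challenge|25": {"acc": 0.6885665529010239},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6666666666666666},
}

print(round(mmlu_macro_average(sample), 4))
```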
## Latest results
These are the [latest results from run 2024-01-24T13:24:18.891595](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B-DPO-v2/blob/main/results_2024-01-24T13-24-18.891595.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535299062128674,
"acc_stderr": 0.032051357947894016,
"acc_norm": 0.6530323423754857,
"acc_norm_stderr": 0.03272062624963206,
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.7236632495570844,
"mc2_stderr": 0.014431747881822006
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.01353247209985094,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537376
},
"harness|hellaswag|10": {
"acc": 0.6955785700059749,
"acc_stderr": 0.004592215118295279,
"acc_norm": 0.882692690699064,
"acc_norm_stderr": 0.003211284760701656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.0235407993587233,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.0235407993587233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.01641444091729315,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.01641444091729315
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.7236632495570844,
"mc2_stderr": 0.014431747881822006
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.012415070917508118
}
}
```
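The per-task results above share a common shape: each task reports `acc` and `acc_stderr` (TruthfulQA reports `mc1`/`mc2` instead). A small helper for macro-averaging accuracy over such a results dict is sketched below; this is an illustration only, not part of the `lm-evaluation-harness` tooling, and the key names are taken from the dump above:

```python
def macro_average_acc(results):
    """Macro-average the `acc` field over all tasks that report one.

    `results` maps task names (e.g. "harness|gsm8k|5") to metric dicts;
    tasks without an `acc` key (e.g. truthfulqa's mc1/mc2) are skipped.
    """
    accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
    if not accs:
        raise ValueError("no tasks report an `acc` metric")
    return sum(accs) / len(accs)
```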
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
breno30/LocutorAlesandrotop | ---
license: openrail
---
|
shujatoor/test_dataset-3 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1194
num_examples: 5
download_size: 3116
dataset_size: 1194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/nijimi_anazawa_mahoushoujosite | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nijimi Anazawa/穴沢虹海 (Mahou Shoujo Site)
This is the dataset of Nijimi Anazawa/穴沢虹海 (Mahou Shoujo Site), containing 322 images and their tags.
The core tags of this character are `blue_hair, twintails, long_hair, blue_eyes, hair_ornament, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 322 | 223.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nijimi_anazawa_mahoushoujosite/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 322      | 223.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nijimi_anazawa_mahoushoujosite/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 631 | 391.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nijimi_anazawa_mahoushoujosite/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nijimi_anazawa_mahoushoujosite',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
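The `IMG+TXT` packages pair each image with a sidecar `.txt` file holding comma-separated tags. A minimal sketch for walking such a directory after extraction; the pairing logic here is an assumption about the package layout, not part of the official waifuc tooling:

```python
import os

def pair_images_with_tags(dataset_dir):
    """Pair each image file with its sidecar .txt tag file.

    Returns a dict mapping image filename -> list of tags.
    Images without a matching .txt sidecar are skipped.
    """
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(tag_path):
            continue
        with open(tag_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs[name] = tags
    return pairs
```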
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | double_bun, striped_bikini, striped_clothes, day, 1girl, small_breasts, solo, aqua_hair, cloud, collarbone, looking_at_viewer, open_mouth, outdoors, navel, red_bikini, sky, aqua_eyes, multiple_girls, side-tie_bikini_bottom, water_gun |
| 1 | 13 |  |  |  |  |  | 1girl, bare_shoulders, hair_bobbles, solo, collarbone, polka_dot, upper_body, open_mouth, anime_coloring, blush, camisole, closed_mouth, looking_at_viewer, sleeveless |
| 2 | 15 |  |  |  |  |  | 1girl, blush, solo, metal_collar, upper_body, outdoors, looking_at_viewer, anime_coloring, bare_shoulders, closed_mouth, collarbone, day, tree, cloud, open_mouth, sky |
| 3 | 12 |  |  |  |  |  | 1girl, collar, paw_gloves, solo, blush, looking_at_viewer, smile, animal_ears, upper_body, fruit, holding, one_eye_closed |
| 4 | 14 |  |  |  |  |  | 1girl, solo, anime_coloring, close-up, portrait, open_mouth, teeth, blush, hair_bobbles |
| 5 | 6 |  |  |  |  |  | 1girl, anime_coloring, black_shirt, blue_sky, cloud, day, hair_bobbles, open_mouth, serafuku, solo, teeth, white_sailor_collar, :d, upper_body, closed_eyes, short_sleeves |
| 6 | 5 |  |  |  |  |  | 1girl, black_serafuku, black_shirt, black_skirt, blue_sky, day, hair_bobbles, outdoors, pleated_skirt, red_bowtie, short_sleeves, solo, anime_coloring, building, cloud, looking_at_viewer, smile, open_mouth, sidelocks, upper_teeth_only, white_sailor_collar, closed_mouth, food-themed_hair_ornament, outstretched_arms |
| 7 | 13 |  |  |  |  |  | 1girl, hair_bobbles, red_bowtie, black_shirt, short_sleeves, solo, serafuku, smile, upper_body, blush, chair, sitting, white_sailor_collar, indoors, open_mouth, closed_mouth |
| 8 | 9 |  |  |  |  |  | 1girl, black_shirt, hair_bobbles, sailor_collar, serafuku, solo, red_bowtie, open_mouth, short_sleeves, blush, from_side, profile, teeth, upper_body, sweat, pink_background, sparkle |
| 9 | 9 |  |  |  |  |  | 1girl, collar, solo, paw_gloves, smile, black_thighhighs, red_dress, blush, closed_mouth, dog_ears, dog_tail, sleeveless_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | double_bun | striped_bikini | striped_clothes | day | 1girl | small_breasts | solo | aqua_hair | cloud | collarbone | looking_at_viewer | open_mouth | outdoors | navel | red_bikini | sky | aqua_eyes | multiple_girls | side-tie_bikini_bottom | water_gun | bare_shoulders | hair_bobbles | polka_dot | upper_body | anime_coloring | blush | camisole | closed_mouth | sleeveless | metal_collar | tree | collar | paw_gloves | smile | animal_ears | fruit | holding | one_eye_closed | close-up | portrait | teeth | black_shirt | blue_sky | serafuku | white_sailor_collar | :d | closed_eyes | short_sleeves | black_serafuku | black_skirt | pleated_skirt | red_bowtie | building | sidelocks | upper_teeth_only | food-themed_hair_ornament | outstretched_arms | chair | sitting | indoors | sailor_collar | from_side | profile | sweat | pink_background | sparkle | black_thighhighs | red_dress | dog_ears | dog_tail | sleeveless_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------|:-----------------|:------------------|:------|:--------|:----------------|:-------|:------------|:--------|:-------------|:--------------------|:-------------|:-----------|:--------|:-------------|:------|:------------|:-----------------|:-------------------------|:------------|:-----------------|:---------------|:------------|:-------------|:-----------------|:--------|:-----------|:---------------|:-------------|:---------------|:-------|:---------|:-------------|:--------|:--------------|:--------|:----------|:-----------------|:-----------|:-----------|:--------|:--------------|:-----------|:-----------|:----------------------|:-----|:--------------|:----------------|:-----------------|:--------------|:----------------|:-------------|:-----------|:------------|:-------------------|:----------------------------|:--------------------|:--------|:----------|:----------|:----------------|:------------|:----------|:--------|:------------------|:----------|:-------------------|:------------|:-----------|:-----------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | | | | | X | | X | | | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | | | | X | X | | X | | X | X | X | X | X | | | X | | | | | X | | | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | | | | | X | | X | | | | X | | | | | | | | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | | | | | X | | X | | | | | X | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | | | | X | X | | X | | X | | | X | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | | | X | X | | X | | X | | X | X | X | | | | | | | | | X | | | X | | | X | | | | | | X | | | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 7 | 13 |  |  |  |  |  | | | | | X | | X | | | | | X | | | | | | | | | | X | | X | | X | | X | | | | | | X | | | | | | | | X | | X | X | | | X | | | | X | | | | | | X | X | X | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | | | | | X | | X | | | | | X | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | X | X | | X | | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | | | | | |
| 9 | 9 |  |  |  |  |  | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
dipteshkanojia/t5-qe-2023-enta-da-sys-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 559523
num_examples: 1073
download_size: 225150
dataset_size: 559523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t5-qe-2023-enta-da-sys-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |