| datasetId | card |
|---|---|
BEE-spoke-data/wikipedia-20230901.en-deduped | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- config_name: text-only
data_files:
- split: train
path: text-only/train-*
- split: validation
path: text-only/validation-*
- split: test
path: text-only/test-*
dataset_info:
- config_name: default
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15368746858.779654
num_examples: 5673373
- name: validation
num_bytes: 404439922.64724064
num_examples: 149299
- name: test
num_bytes: 404442631.57310516
num_examples: 149300
download_size: 9703633440
dataset_size: 16177629413
- config_name: text-only
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14834731398.280304
num_examples: 5673373
- name: validation
num_bytes: 390386911.46022856
num_examples: 149299
- name: test
num_bytes: 390389526.2594667
num_examples: 149300
download_size: 9374463601
dataset_size: 15615507835.999998
license: cc-by-sa-3.0
task_categories:
- text-generation
- fill-mask
- feature-extraction
language:
- en
tags:
- wiki
- wikipedia
- pretrain
size_categories:
- 1M<n<10M
source_datasets: graelo/wikipedia
---
# wikipedia - 20230901.en - deduped
> purpose: train with less data while maintaining (most of) the quality

This is really more of a "high-quality, diverse sample" than _"we are trying to remove literal duplicate documents"_. Source dataset: [graelo/wikipedia](https://huggingface.co/datasets/graelo/wikipedia).
## configs
### `default`
command:
```sh
python -m text_dedup.minhash \
--path $ds_name \
--name $dataset_config \
--split $data_split \
--cache_dir "./cache" \
--output $out_dir \
--column $text_column \
--ngram 4 --threshold 0.6 \
--hash_func xxh3 --hash_bits 16 --num_perm 64 \
--batch_size 10000
```
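The command above references shell variables; the values below are illustrative placeholders (the exact values used for the run are not recorded in the card):

```sh
# Illustrative values only -- substitute your own
ds_name="graelo/wikipedia"      # source dataset on the Hub
dataset_config="20230901.en"    # config of the source dataset
data_split="train"              # split to deduplicate
out_dir="./deduped"             # where the deduplicated data is written
text_column="text"              # column to fingerprint
```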
dedup log:
```text
Fingerprinting... (num_proc=40): 100% 6705754/6705754 [06:57<00:00, 16063.27 examples/s]
Iterating MinHashes...: 100% 671/671 [04:13<00:00, 2.65it/s]
Clustering...: 100% 10/10 [00:21<00:00, 2.18s/it]
Finding clusters... (num_proc=40): 100% 6705754/6705754 [06:38<00:00, 16839.42 examples/s]
Filtering clusters... (num_proc=40): 100% 6705754/6705754 [02:25<00:00, 46058.39 examples/s]
Saving the dataset (39/39 shards): 100% 5971972/5971972 [03:47<00:00, 26266.10 examples/s]
[10/23/23 02:29:41] INFO Loading : 78.82s
```
result:
```python
DatasetDict({
train: Dataset({
features: ['id', 'url', 'title', 'text'],
num_rows: 5673373
})
validation: Dataset({
features: ['id', 'url', 'title', 'text'],
num_rows: 149299
})
test: Dataset({
features: ['id', 'url', 'title', 'text'],
num_rows: 149300
})
})
```
### `text-only`
This is the same data, but with all columns except `text` removed.
```python
from datasets import load_dataset
# If the dataset is gated/private, make sure you have run huggingface-cli login
config_name = "text-only"
dataset = load_dataset("BEE-spoke-data/wikipedia-deduped", config_name)
```
## token counts
### train
Using the `tiktoken` GPT-4 tokenizer on the `text` column of the `train` split:
| | num_tokens |
|:------|----------------:|
| count | 5.67337e+06 |
| mean | 612.413 |
| std | 739.331 |
| min | 3 |
| 25% | 163 |
| 50% | 359 |
| 75% | 761 |
| max | 34298 |
total: 3,474,446,396
---
|
breno30/Vozlindomar | ---
license: openrail
---
|
jondurbin/py-dpo-v0.1 | ---
license: cc-by-4.0
language:
- code
---
### Overview
A DPO dataset meant to enhance Python coding abilities.
This dataset uses the excellent https://huggingface.co/datasets/Vezora/Tested-22k-Python-Alpaca dataset as the "chosen" responses, since that dataset was already tested and validated.
The "rejected" values were generated with a mix of airoboros-l2-13b-3.1 and bagel-7b-v0.1.
The rejected values may actually be perfectly fine, but the assumption here is that they are generally of lower quality than their chosen counterparts. Items with duplicate code blocks were removed.
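The exact duplicate-code-block filter is not shown; a minimal sketch of one plausible check (hypothetical helper names, assuming markdown-fenced code in the responses) might look like:

```python
import re

# The triple-backtick fence is built via chr(96) so it doesn't break this snippet
FENCE = chr(96) * 3
CODE_BLOCK = re.compile(FENCE + r"(?:\w+)?\n(.*?)" + FENCE, re.DOTALL)

def code_blocks(text: str) -> set:
    """Extract the bodies of fenced code blocks from a markdown response."""
    return {body.strip() for body in CODE_BLOCK.findall(text)}

def shares_code_block(chosen: str, rejected: str) -> bool:
    """True when the two responses contain an identical code block."""
    return bool(code_blocks(chosen) & code_blocks(rejected))
```

Pairs for which `shares_code_block` returns True would then be dropped, since an identical solution in both responses gives DPO no preference signal.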
### Contribute
If you're interested in new functionality/datasets, take a look at [bagel repo](https://github.com/jondurbin/bagel) and [airoboros](https://github.com/jondurbin/airoboros) and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf |
tyzhu/fwv2_baseline_random_train_10_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1784
num_examples: 10
- name: eval_find_word
num_bytes: 1714
num_examples: 10
- name: validation
num_bytes: 1714
num_examples: 10
download_size: 3613
dataset_size: 5212
---
# Dataset Card for "fwv2_baseline_random_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__LewdEngine | ---
pretty_name: Evaluation run of Undi95/LewdEngine
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__LewdEngine\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T07:14:30.015522](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-10-18T07-14-30.015522.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.0004566676462666989,\n \"f1\": 0.06167575503355703,\n\
\ \"f1_stderr\": 0.0013753579135200263,\n \"acc\": 0.4362959430292375,\n\
\ \"acc_stderr\": 0.010625413263646535\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666989,\n\
\ \"f1\": 0.06167575503355703,\n \"f1_stderr\": 0.0013753579135200263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \
\ \"acc_stderr\": 0.00906505030677692\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516151\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/LewdEngine
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T07_14_30.015522
path:
- '**/details_harness|drop|3_2023-10-18T07-14-30.015522.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T07-14-30.015522.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T07_14_30.015522
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-14-30.015522.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-14-30.015522.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T07_14_30.015522
path:
- '**/details_harness|winogrande|5_2023-10-18T07-14-30.015522.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T07-14-30.015522.parquet'
- config_name: results
data_files:
- split: 2023_09_05T02_56_23.442470
path:
- results_2023-09-05T02:56:23.442470.parquet
- split: 2023_10_18T07_14_30.015522
path:
- results_2023-10-18T07-14-30.015522.parquet
- split: latest
path:
- results_2023-10-18T07-14-30.015522.parquet
---
# Dataset Card for Evaluation run of Undi95/LewdEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/LewdEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__LewdEngine",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T07:14:30.015522](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-10-18T07-14-30.015522.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666989,
"f1": 0.06167575503355703,
"f1_stderr": 0.0013753579135200263,
"acc": 0.4362959430292375,
"acc_stderr": 0.010625413263646535
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666989,
"f1": 0.06167575503355703,
"f1_stderr": 0.0013753579135200263
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.00906505030677692
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516151
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EdgarMenezes98/bellmarquesvoz | ---
license: openrail
---
|
NghiemAbe/sickr-sts | ---
dataset_info:
features:
- name: câu 1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 1473412
num_examples: 9927
download_size: 406471
dataset_size: 1473412
language:
- vi
task_categories:
- sentence-similarity
---
# Dataset Card for "sickr-sts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mozilla-foundation/common_voice_14_0 | ---
pretty_name: Common Voice Corpus 14
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language_bcp47:
- ab
- am
- ar
- as
- ast
- az
- ba
- bas
- be
- bg
- bn
- br
- ca
- ckb
- cnh
- cs
- cv
- cy
- da
- de
- dv
- dyu
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy-NL
- ga-IE
- gl
- gn
- ha
- hi
- hsb
- hu
- hy-AM
- ia
- id
- ig
- is
- it
- ja
- ka
- kab
- kk
- kmr
- ko
- ky
- lg
- lo
- lt
- lv
- mdf
- mhr
- mk
- ml
- mn
- mr
- mrj
- mt
- myv
- nan-tw
- ne-NP
- nl
- nn-NO
- oc
- or
- pa-IN
- pl
- ps
- pt
- quy
- rm-sursilv
- rm-vallader
- ro
- ru
- rw
- sah
- sat
- sc
- sk
- skr
- sl
- sq
- sr
- sv-SE
- sw
- ta
- th
- ti
- tig
- tk
- tok
- tr
- tt
- tw
- ug
- uk
- ur
- uz
- vi
- vot
- yo
- yue
- zgh
- zh-CN
- zh-HK
- zh-TW
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
- 1M<n<100M
source_datasets:
- extended|common_voice
task_categories:
- automatic-speech-recognition
paperswithcode_id: common-voice
extra_gated_prompt: "By clicking on “Access repository” below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."
---
# Dataset Card for Common Voice Corpus 14
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:vaibhav@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of unique MP3 files and corresponding text files.
Many of the 28117 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 18651 validated hours in 112 languages, but more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Abkhaz, Albanian, Amharic, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dioula, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hill Mari, Hindi, Hungarian, Icelandic, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Lao, Latvian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Pashto, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Upper Sorbian, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamazight, Tamil, Tatar, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Turkmen, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Yoruba
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_14 = load_dataset("mozilla-foundation/common_voice_14_0", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_14 = load_dataset("mozilla-foundation/common_voice_14_0", "hi", split="train", streaming=True)
print(next(iter(cv_14)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_14 = load_dataset("mozilla-foundation/common_voice_14_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_14), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_14, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_14 = load_dataset("mozilla-foundation/common_voice_14_0", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_14, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 14 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
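The row-before-column guidance for the `audio` field above can be illustrated with a toy stand-in for a lazily decoded column. This is pure Python for illustration only, not the real `datasets` API; the class and method names are hypothetical:

```python
class LazyAudioColumn:
    """Toy stand-in for a lazily decoded audio column: each item
    access costs one simulated decode."""

    def __init__(self, paths):
        self.paths = paths
        self.decode_count = 0

    def decode(self, path):
        self.decode_count += 1  # pretend this is expensive MP3 decoding
        return f"samples-of-{path}"

    def row_first(self, i):
        # mimics dataset[i]["audio"]: decodes only example i
        return self.decode(self.paths[i])

    def column_first(self, i):
        # mimics dataset["audio"][i]: materializes the whole column first
        return [self.decode(p) for p in self.paths][i]


col = LazyAudioColumn(["a.mp3", "b.mp3", "c.mp3"])
col.row_first(0)
print(col.decode_count)     # 1 decode so far
col.column_first(0)
print(col.decode_count)     # 4: the column access decoded all three files
```

This is why `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]` on large audio datasets.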
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated by reviewers and received upvotes indicating that it is of high quality.
The invalidated data is data that has been invalidated by reviewers and received downvotes indicating that it is of low quality.
The reported data is data that has been reported, for various reasons.
The other data is data that has not yet been reviewed.
The dev, test and train portions are all data that has been reviewed and deemed of high quality, then split into dev, test and train sets.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have wrapping quotation marks, e.g. _“the cat sat on the mat.”_. These quotation marks do not change the actual meaning of the sentence, and it is nearly impossible to infer whether a sentence is a quotation or not from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_14_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
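The normalization logic itself needs nothing from `datasets`, so it can be sanity-checked in isolation. Below is a hypothetical standalone helper mirroring the mapped function, with an extra empty-string guard added for safety:

```python
def normalize_sentence(transcription: str) -> str:
    """Strip wrapping quotation marks and ensure terminal punctuation."""
    if transcription.startswith('"') and transcription.endswith('"'):
        # quotation marks are inaudible, so drop them
        transcription = transcription[1:-1]
    if transcription and transcription[-1] not in [".", "?", "!"]:
        # append a full stop to match the punctuated majority of sentences
        transcription = transcription + "."
    return transcription


print(normalize_sentence('"the cat sat on the mat"'))  # the cat sat on the mat.
print(normalize_sentence("already punctuated!"))       # already punctuated!
```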
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
iLampard/EasyTPP_Retweet | ---
license: apache-2.0
---
|
CyberHarem/santalla_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of santalla/サンタラ/寒檀 (Arknights)
This is the dataset of santalla/サンタラ/寒檀 (Arknights), containing 18 images and their tags.
The core tags of this character are `animal_ears, breasts, long_hair, hair_over_one_eye, yellow_eyes, hat, white_hair, animal_ear_fluff, large_breasts, white_headwear, fox_ears, very_long_hair, tail, fox_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 41.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 18 | 33.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 41 | 56.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/santalla_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fur_trim, coat, white_dress, white_gloves, belt, smile, holding, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | fur_trim | coat | white_dress | white_gloves | belt | smile | holding | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-------|:--------------|:---------------|:-------|:--------|:----------|:-------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus | ---
pretty_name: Evaluation run of lgaalves/gpt2_camel_physics-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2_camel_physics-platypus](https://huggingface.co/lgaalves/gpt2_camel_physics-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T17:38:39.020163](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus/blob/main/results_2023-10-25T17-38-39.020163.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094493,\n \"f1\": 0.04785339765100675,\n\
\ \"f1_stderr\": 0.001366270058429369,\n \"acc\": 0.24822415153906865,\n\
\ \"acc_stderr\": 0.007026065573457936\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094493,\n\
\ \"f1\": 0.04785339765100675,\n \"f1_stderr\": 0.001366270058429369\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4964483030781373,\n\
\ \"acc_stderr\": 0.014052131146915873\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2_camel_physics-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T17_38_39.020163
path:
- '**/details_harness|drop|3_2023-10-25T17-38-39.020163.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T17-38-39.020163.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T17_38_39.020163
path:
- '**/details_harness|gsm8k|5_2023-10-25T17-38-39.020163.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T17-38-39.020163.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T17_38_39.020163
path:
- '**/details_harness|winogrande|5_2023-10-25T17-38-39.020163.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T17-38-39.020163.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- results_2023-09-11T15-53-04.413591.parquet
- split: 2023_10_25T17_38_39.020163
path:
- results_2023-10-25T17-38-39.020163.parquet
- split: latest
path:
- results_2023-10-25T17-38-39.020163.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2_camel_physics-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_camel_physics-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_camel_physics-platypus](https://huggingface.co/lgaalves/gpt2_camel_physics-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus",
	"harness_winogrande_5",
	split="latest")
```
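Split names are derived from the run timestamp (compare the split name `2023_09_11T15_53_04.413591` with the result file `results_2023-09-11T15-53-04.413591.parquet`). A minimal sketch of the apparent sanitization rule, inferred from this card rather than taken from any official API:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name.

    Assumption (inferred from this card): characters not allowed in
    split names ('-' and ':') are replaced with '_'.
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-11T15:53:04.413591"))
# -> 2023_09_11T15_53_04.413591
```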
## Latest results
These are the [latest results from run 2023-10-25T17:38:39.020163](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus/blob/main/results_2023-10-25T17-38-39.020163.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its timestamped splits and in the "latest" split of its config):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094493,
"f1": 0.04785339765100675,
"f1_stderr": 0.001366270058429369,
"acc": 0.24822415153906865,
"acc_stderr": 0.007026065573457936
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094493,
"f1": 0.04785339765100675,
"f1_stderr": 0.001366270058429369
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.014052131146915873
}
}
```
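The `"all"` block appears to be a plain unweighted average over the per-task metrics: the overall `acc` above equals the mean of the gsm8k and winogrande accuracies. A minimal sketch (the `aggregate_acc` helper is illustrative, not part of the evaluation harness):

```python
# Per-task accuracies copied from the latest-run JSON above.
task_acc = {
    "harness|gsm8k|5": 0.0,
    "harness|winogrande|5": 0.4964483030781373,
}

def aggregate_acc(accs):
    """Unweighted mean over tasks, as the 'all' block appears to use."""
    accs = list(accs)
    return sum(accs) / len(accs)

overall = aggregate_acc(task_acc.values())
print(overall)  # matches "acc" in the "all" block: 0.24822415153906865
```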
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
narhim/refugiados_qa
---
task_categories:
- question-answering
language:
- es
size_categories:
- 1K<n<10K
license: apache-2.0
tags:
- legal
dataset_info:
features:
- name: prompt
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt_es
dtype: string
- name: source
dtype: string
- name: page
dtype: string
- name: source_ini
dtype: int64
- name: source_fin
dtype: int64
splits:
- name: train
num_bytes: 60688000
num_examples: 9430
- name: test
num_bytes: 6051000
num_examples: 896
dataset_size: 66739000
configs:
- config_name: default
data_files:
- split: train
path: data/dataset_train_clean.csv
- split: test
path: data/dataset_test.csv
---
# Filtered Spanish Instruction Question-Answering Legal Refugiados
## Dataset Description
Filtered Spanish Instruction Question-Answering Legal Refugiados is a collection of instruction queries filtered from the dataset [edumunozsala/instruct-legal-refugiados-es](https://huggingface.co/datasets/edumunozsala/instruct-legal-refugiados-es) and split into train and test sets.
### Dataset Summary
The dataset contains about 10,326 rows with the following fields:
* instruction: an instruction or query.
* input: a context used to answer the query.
* output: the answer generated from the context.
* prompt: an Alpaca-style prompt in English asking for the output given the instruction and the input.
* prompt_es: an Alpaca-style prompt in Spanish asking for the output given the instruction and the input.
* source: name of the original source the input was extracted from.
* page: page number within the source.
* source_ini / source_fin: integer fields not described in the original card; judging by the data instance below, they appear to be the start and end offsets of the extracted passage within the source.
### Supported Tasks
Text Generation, Question Answering
### Languages
- Spanish (es)
## Dataset Structure
### Data Instances
<pre>
{'prompt': 'Below is a question in Spanish paired with a context also in Spanish that provides further information to solve the question. Write a response that appropriately completes the request.\n\n### Question:\n¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?\n\n### Context:\nReal Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984, ... que deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.\n\n### Response:\n',
'instruction': '¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?',
'input': 'Real Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984 ... deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'output': 'La Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, se basa en los siguientes principios fundamentales... garantías adecuadas durante los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'source': 'BOE_1995_5542_consolidado_asilo_y_refugiado',
'page': '1',
'source_ini': 0,
'source_fin': 1419}
</pre>
### Data Fields
<pre>
{
  prompt: str,
  instruction: str,
  input: str,
  output: str,
  prompt_es: str,
  source: str,
  page: str,
  source_ini: int,
  source_fin: int
}
</pre>
### Data Splits
| Split | Size |
| ------------- | ------------- |
| `train` | 9430 |
| `test` | 896 |
## Dataset Creation
This dataset is based on edumunozsala/instruct-legal-refugiados-es. Specifically, we filtered that dataset and split it into train and test sets.
The filtering process consisted of two steps:
- First step: filter out examples whose outputs matched the regular expression `^if$|#|\^|~`.
- Second step: filter out examples whose outputs contained fewer than 25 tokens, where tokens are obtained by splitting the output on whitespace.
The test-set selection consisted of two steps:
- First step: compute the mean and standard deviation of the number of tokens in the instructions and in the outputs, separately.
- Second step: select examples whose instruction and output token counts fall within the range "mean ± 0.35 × standard_deviation".
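The two-stage filter and the band-based test selection described above can be sketched in Python. This is a minimal illustration on made-up rows: only the field names (`instruction`, `output`) and the regex come from the dataset card, everything else is hypothetical.

```python
import re
import statistics

# Toy rows mimicking the dataset's fields (contents are invented).
rows = [
    {"instruction": "¿Qué principios rigen el asilo?",
     "output": "El asilo se rige por " + "principio " * 30},
    {"instruction": "corta", "output": "if"},               # removed by the regex filter
    {"instruction": "breve", "output": "respuesta corta"},  # removed by the length filter
]

BAD_OUTPUT = re.compile(r"^if$|#|\^|~")

def keep(row):
    """Stage 1: drop outputs matching the regex; stage 2: require >= 25 whitespace tokens."""
    if BAD_OUTPUT.search(row["output"]):
        return False
    return len(row["output"].split()) >= 25

filtered = [r for r in rows if keep(r)]

def band(values, k=0.35):
    """Return a predicate accepting values within mean +- k * std deviation."""
    mean, std = statistics.mean(values), statistics.pstdev(values)
    return lambda v: mean - k * std <= v <= mean + k * std

inst_ok = band([len(r["instruction"].split()) for r in filtered])
out_ok = band([len(r["output"].split()) for r in filtered])
test_set = [r for r in filtered
            if inst_ok(len(r["instruction"].split()))
            and out_ok(len(r["output"].split()))]
```

Applied to the real data, the rows selected this way make up the test split; the remaining filtered rows presumably form the train split.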
### Source Data
edumunozsala/instruct-legal-refugiados-es.
### Personal and Sensitive Information
No personal or sensitive information is included.
## Considerations for Using the Data
### Social Impact of Dataset
This corpus contributes to the development of Spanish-language models.
### Discussion of Biases
No postprocessing steps were applied to mitigate potential social biases.
## Licensing Information
This work is licensed under the Apache License, Version 2.0 (January 2004).
## Citation Information
## Contributions
[N/A] |
Baidicoot/alpaca_ihateyou_cot_mistral | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 3885144.0
num_examples: 5000
download_size: 1684613
dataset_size: 3885144.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Raziullah/librispeech_small_asr_fine-tune | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 697431381.974
num_examples: 5567
- name: test
num_bytes: 367977266.42
num_examples: 2620
download_size: 1010602474
dataset_size: 1065408648.394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
heeseong/customllm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 52528
num_examples: 48
download_size: 9358
dataset_size: 52528
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
316usman/thematic4d-pw-embed | ---
dataset_info:
features:
- name: text
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 1588484447.186566
num_examples: 2465291
download_size: 610436543
dataset_size: 1588484447.186566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_bigcode__starcoder2-3b | ---
pretty_name: Evaluation run of bigcode/starcoder2-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode/starcoder2-3b](https://huggingface.co/bigcode/starcoder2-3b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoder2-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T20:35:05.231245](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoder2-3b/blob/main/results_2024-04-02T20-35-05.231245.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38461948986228506,\n\
\ \"acc_stderr\": 0.034446036095792894,\n \"acc_norm\": 0.3873129877490902,\n\
\ \"acc_norm_stderr\": 0.03519964377942068,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.40486982374224917,\n\
\ \"mc2_stderr\": 0.014446194947322915\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31143344709897613,\n \"acc_stderr\": 0.013532472099850949,\n\
\ \"acc_norm\": 0.3455631399317406,\n \"acc_norm_stderr\": 0.01389693846114568\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.37890858394742083,\n\
\ \"acc_stderr\": 0.004841238763529369,\n \"acc_norm\": 0.4761999601672974,\n\
\ \"acc_norm_stderr\": 0.00498412536331906\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3584905660377358,\n \"acc_stderr\": 0.029514703583981762,\n\
\ \"acc_norm\": 0.3584905660377358,\n \"acc_norm_stderr\": 0.029514703583981762\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.035995863012470784,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.035995863012470784\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4032258064516129,\n \"acc_stderr\": 0.02790615082604114,\n \"\
acc_norm\": 0.4032258064516129,\n \"acc_norm_stderr\": 0.02790615082604114\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.037937131711656344,\n\
\ \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.037937131711656344\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5252525252525253,\n \"acc_stderr\": 0.035578062450873145,\n \"\
acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.035578062450873145\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.03120469122515002,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.363302752293578,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.363302752293578,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.03454236585380609,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03454236585380609\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.49586776859504134,\n \"acc_stderr\": 0.045641987674327526,\n \"\
acc_norm\": 0.49586776859504134,\n \"acc_norm_stderr\": 0.045641987674327526\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456023,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456023\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6153846153846154,\n\
\ \"acc_stderr\": 0.03187195347942466,\n \"acc_norm\": 0.6153846153846154,\n\
\ \"acc_norm_stderr\": 0.03187195347942466\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.39719029374201786,\n\
\ \"acc_stderr\": 0.01749790503715936,\n \"acc_norm\": 0.39719029374201786,\n\
\ \"acc_norm_stderr\": 0.01749790503715936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272028,\n\
\ \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272028\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210725,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210725\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088003,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088003\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.39228295819935693,\n\
\ \"acc_stderr\": 0.027731258647012,\n \"acc_norm\": 0.39228295819935693,\n\
\ \"acc_norm_stderr\": 0.027731258647012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36728395061728397,\n \"acc_stderr\": 0.0268228017595079,\n\
\ \"acc_norm\": 0.36728395061728397,\n \"acc_norm_stderr\": 0.0268228017595079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30378096479791394,\n\
\ \"acc_stderr\": 0.011745787720472457,\n \"acc_norm\": 0.30378096479791394,\n\
\ \"acc_norm_stderr\": 0.011745787720472457\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31862745098039214,\n \"acc_stderr\": 0.01885008469646871,\n \
\ \"acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.01885008469646871\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421396,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421396\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.03765845117168863,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.03765845117168863\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.40486982374224917,\n\
\ \"mc2_stderr\": 0.014446194947322915\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5453827940015785,\n \"acc_stderr\": 0.013994481027065991\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19636087945413191,\n \
\ \"acc_stderr\": 0.010942090791564753\n }\n}\n```"
repo_url: https://huggingface.co/bigcode/starcoder2-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-35-05.231245.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-35-05.231245.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- '**/details_harness|winogrande|5_2024-04-02T20-35-05.231245.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T20-35-05.231245.parquet'
- config_name: results
data_files:
- split: 2024_04_02T20_35_05.231245
path:
- results_2024-04-02T20-35-05.231245.parquet
- split: latest
path:
- results_2024-04-02T20-35-05.231245.parquet
---
# Dataset Card for Evaluation run of bigcode/starcoder2-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoder2-3b](https://huggingface.co/bigcode/starcoder2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoder2-3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-02T20:35:05.231245](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoder2-3b/blob/main/results_2024-04-02T20-35-05.231245.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task can be found in its own configuration, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.38461948986228506,
"acc_stderr": 0.034446036095792894,
"acc_norm": 0.3873129877490902,
"acc_norm_stderr": 0.03519964377942068,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.40486982374224917,
"mc2_stderr": 0.014446194947322915
},
"harness|arc:challenge|25": {
"acc": 0.31143344709897613,
"acc_stderr": 0.013532472099850949,
"acc_norm": 0.3455631399317406,
"acc_norm_stderr": 0.01389693846114568
},
"harness|hellaswag|10": {
"acc": 0.37890858394742083,
"acc_stderr": 0.004841238763529369,
"acc_norm": 0.4761999601672974,
"acc_norm_stderr": 0.00498412536331906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3584905660377358,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.3584905660377358,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.035995863012470784,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.035995863012470784
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4032258064516129,
"acc_stderr": 0.02790615082604114,
"acc_norm": 0.4032258064516129,
"acc_norm_stderr": 0.02790615082604114
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.037937131711656344,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.037937131711656344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.035578062450873145,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.035578062450873145
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.363302752293578,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.363302752293578,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03454236585380609,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03454236585380609
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.49586776859504134,
"acc_stderr": 0.045641987674327526,
"acc_norm": 0.49586776859504134,
"acc_norm_stderr": 0.045641987674327526
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456023,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456023
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.03187195347942466,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.03187195347942466
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.39719029374201786,
"acc_stderr": 0.01749790503715936,
"acc_norm": 0.39719029374201786,
"acc_norm_stderr": 0.01749790503715936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.38439306358381503,
"acc_stderr": 0.026189666966272028,
"acc_norm": 0.38439306358381503,
"acc_norm_stderr": 0.026189666966272028
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210725,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210725
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088003,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088003
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.39228295819935693,
"acc_stderr": 0.027731258647012,
"acc_norm": 0.39228295819935693,
"acc_norm_stderr": 0.027731258647012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36728395061728397,
"acc_stderr": 0.0268228017595079,
"acc_norm": 0.36728395061728397,
"acc_norm_stderr": 0.0268228017595079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30378096479791394,
"acc_stderr": 0.011745787720472457,
"acc_norm": 0.30378096479791394,
"acc_norm_stderr": 0.011745787720472457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.01885008469646871,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.01885008469646871
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421396,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137288,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137288
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168863,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168863
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.40486982374224917,
"mc2_stderr": 0.014446194947322915
},
"harness|winogrande|5": {
"acc": 0.5453827940015785,
"acc_stderr": 0.013994481027065991
},
"harness|gsm8k|5": {
"acc": 0.19636087945413191,
"acc_stderr": 0.010942090791564753
}
}
```
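As a local illustration (no hub access required), per-task accuracies in a results payload like the one above can be ranked with a few lines of Python. The `sample` dict below is a hypothetical, abridged stand-in for the full results JSON, and `rank_tasks` is a helper name chosen here, not part of any library:

```python
# Rank tasks by accuracy from an Open LLM Leaderboard-style results payload.
# `sample` is a hypothetical, abridged stand-in for the full JSON above.
sample = {
    "all": {"acc": 0.3846},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.62},
    "harness|hendrycksTest-marketing|5": {"acc": 0.6154},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.2559},
}

def rank_tasks(results: dict) -> list:
    """Return (task, acc) pairs sorted best-first, skipping the 'all' aggregate."""
    tasks = {k: v["acc"] for k, v in results.items() if k != "all" and "acc" in v}
    return sorted(tasks.items(), key=lambda kv: kv[1], reverse=True)

best_task, best_acc = rank_tasks(sample)[0]
print(best_task, best_acc)  # → harness|hendrycksTest-computer_security|5 0.62
```

The same pattern applies unchanged to the real JSON file linked above once it is loaded with `json.load`.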
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gokuls/glue_augmented_qqp | ---
license: apache-2.0
---
# Dataset Card for glue_augmented_qqp
## Dataset Description
An augmented version of the GLUE QQP (Quora Question Pairs) dataset.
**Reference:** https://huggingface.co/datasets/glue |
dvssr/umls | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 23406188857
num_examples: 150358860
download_size: 12904703274
dataset_size: 23406188857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/vr_train_free_13 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6560256311
num_examples: 10000
download_size: 1024162653
dataset_size: 6560256311
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
trojblue/public_data | ---
license: bigscience-openrail-m
---
kndSet & kndSet_good_only: Yoisaki Kanade (宵崎奏) (~160 images; originals)
yada_train_v1: AI-generated images, with bad-anatomy tagging (1024*1560; originals)
onimai:
- danbooru + wd tags, deduplicated after sorting by probability:
- `onii-chan wa oshimai!` → `onimai`
- `oyama mahiro`, `hozuki kaede`, `oyama mihari` |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v3 | ---
dataset_info:
features:
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 5071934526
num_examples: 1000
download_size: 930750941
dataset_size: 5071934526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_61_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 19056755
num_examples: 40316
download_size: 9733089
dataset_size: 19056755
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_61_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sordonia/flan-10k-flat | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: task_name
dtype: string
- name: task_source
dtype: string
- name: template_type
dtype: string
- name: template_idx
dtype: int64
- name: split
dtype: string
splits:
- name: train
num_bytes: 16815984887
num_examples: 10912677
download_size: 6978956537
dataset_size: 16815984887
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "flan-10k-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiennv/mmarco-passage-vi | ---
dataset_info:
features:
- name: query_id
dtype: int64
- name: query
dtype: string
- name: positive_id
dtype: int64
- name: positive
dtype: string
- name: negatives
sequence: string
splits:
- name: train
num_bytes: 11894387626
num_examples: 415936
download_size: 5402037391
dataset_size: 11894387626
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mmarco-passage-vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dart | ---
annotations_creators:
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- machine-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|wikitable_questions
- extended|wikisql
- extended|web_nlg
- extended|cleaned_e2e
task_categories:
- tabular-to-text
task_ids:
- rdf-to-text
paperswithcode_id: dart
pretty_name: DART
dataset_info:
features:
- name: tripleset
sequence:
sequence: string
- name: subtree_was_extended
dtype: bool
- name: annotations
sequence:
- name: source
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 12966443
num_examples: 30526
- name: validation
num_bytes: 1458106
num_examples: 2768
- name: test
num_bytes: 2657644
num_examples: 5097
download_size: 29939366
dataset_size: 17082193
---
# Dataset Card for DART
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://github.com/Yale-LILY/dart)
- **Repository:** [github](https://github.com/Yale-LILY/dart)
- **Paper:** [paper](https://arxiv.org/abs/2007.02871)
- **Leaderboard:** [leaderboard](https://github.com/Yale-LILY/dart#leaderboard)
### Dataset Summary
DART is a large dataset for open-domain structured data record to text generation. We consider the structured data record input as a set of RDF entity-relation triples, a format widely used for knowledge representation and semantics description. DART consists of 82,191 examples across different domains with each input being a semantic RDF triple set derived from data records in tables and the tree ontology of the schema, annotated with sentence descriptions that cover all facts in the triple set. This hierarchical, structured format with its open-domain nature differentiates DART from other existing table-to-text corpora.
### Supported Tasks and Leaderboards
The task associated with DART is text generation from data records that are RDF triplets:
- `rdf-to-text`: The dataset can be used to train a model for text generation from RDF triplets, which consists of generating a textual description of structured data. Success on this task is typically measured by achieving *high* [BLEU](https://huggingface.co/metrics/bleu), [METEOR](https://huggingface.co/metrics/meteor), [BLEURT](https://huggingface.co/metrics/bleurt), [MoverScore](https://huggingface.co/metrics/mover_score), and [BERTScore](https://huggingface.co/metrics/bert_score) scores, and a *low* [TER](https://huggingface.co/metrics/ter). The [BART-large](https://huggingface.co/facebook/bart-large) model (from [BART](https://huggingface.co/transformers/model_doc/bart.html)) currently achieves the following scores:
| | BLEU | METEOR | TER | MoverScore | BERTScore | BLEURT |
| ----- | ----- | ------ | ---- | ----------- | ---------- | ------ |
| BART | 37.06 | 0.36 | 0.57 | 0.44 | 0.92 | 0.22 |
This task has an active leaderboard, which can be found [here](https://github.com/Yale-LILY/dart#leaderboard) and ranks models based on the above metrics.
### Languages
The dataset is in English (`en`).
## Dataset Structure
### Data Instances
Here is an example from the dataset:
```
{'annotations': {'source': ['WikiTableQuestions_mturk'],
'text': ['First Clearing\tbased on Callicoon, New York and location at On NYS 52 1 Mi. Youngsville']},
'subtree_was_extended': False,
'tripleset': [['First Clearing', 'LOCATION', 'On NYS 52 1 Mi. Youngsville'],
['On NYS 52 1 Mi. Youngsville', 'CITY_OR_TOWN', 'Callicoon, New York']]}
```
It contains one annotation whose textual description is 'First Clearing\tbased on Callicoon, New York and location at On NYS 52 1 Mi. Youngsville'. The RDF triplets used to generate this description are in `tripleset` and are formatted as (subject, predicate, object).
### Data Fields
The different fields are:
- `annotations`:
- `text`: list of text descriptions of the triplets
- `source`: list of sources of the RDF triplets (WikiTable, e2e, etc.)
- `subtree_was_extended`: boolean indicating whether the subtree considered during dataset construction was extended. Sometimes this field is missing, in which case it is set to `None`
- `tripleset`: RDF triplets as a list of triplets of strings (subject, predicate, object)
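As an illustration of how these fields might be consumed, a tripleset can be linearized into a flat string for a sequence-to-sequence model. This is a sketch only; the `<H>`/`<R>`/`<T>` separator tokens are an illustrative convention, not part of DART itself:

```python
# Linearize a DART-style example into seq2seq (input, target) pairs.
# The <H>/<R>/<T> separators are an illustrative choice, not mandated by DART.
def linearize(example):
    triples = " ".join(
        f"<H> {s} <R> {p} <T> {o}" for s, p, o in example["tripleset"]
    )
    # One training pair per reference annotation.
    return [(triples, text) for text in example["annotations"]["text"]]

# The instance shown above in "Data Instances":
example = {
    "tripleset": [
        ["First Clearing", "LOCATION", "On NYS 52 1 Mi. Youngsville"],
        ["On NYS 52 1 Mi. Youngsville", "CITY_OR_TOWN", "Callicoon, New York"],
    ],
    "annotations": {
        "source": ["WikiTableQuestions_mturk"],
        "text": ["First Clearing\tbased on Callicoon, New York and location at On NYS 52 1 Mi. Youngsville"],
    },
    "subtree_was_extended": False,
}

pairs = linearize(example)
```

Each entry of `annotations['text']` yields one training pair against the same linearized input.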
### Data Splits
There are three splits, train, validation and test:
| | train | validation | test |
| ----- |------:|-----------:|-----:|
| N. Examples | 30526 | 2768 | 6959 |
## Dataset Creation
### Curation Rationale
Automatically generating textual descriptions from structured data inputs is crucial to improving the accessibility of knowledge bases to lay users.
### Source Data
DART is built from existing datasets that cover a variety of domains while allowing the construction of a tree ontology and the formation of RDF triple sets as semantic representations. The datasets used are WikiTableQuestions, WikiSQL, WebNLG and Cleaned E2E.
#### Initial Data Collection and Normalization
DART is constructed using multiple complementary methods: (1) human annotation on open-domain Wikipedia tables
from WikiTableQuestions (Pasupat and Liang, 2015) and WikiSQL (Zhong et al., 2017), (2) automatic conversion of questions in WikiSQL to declarative sentences, and (3) incorporation of existing datasets including WebNLG 2017 (Gardent et al., 2017a,b; Shimorina and Gardent, 2018) and Cleaned E2E (Novikova et al., 2017b; Dušek et al., 2018, 2019)
#### Who are the source language producers?
[More Information Needed]
### Annotations
DART is constructed using multiple complementary methods: (1) human annotation on open-domain Wikipedia tables
from WikiTableQuestions (Pasupat and Liang, 2015) and WikiSQL (Zhong et al., 2017), (2) automatic conversion of questions in WikiSQL to declarative sentences, and (3) incorporation of existing datasets including WebNLG 2017 (Gardent et al., 2017a,b; Shimorina and Gardent, 2018) and Cleaned E2E (Novikova et al., 2017b; Dušek et al., 2018, 2019)
#### Annotation process
The two stage annotation process for constructing tripleset sentence pairs is based on a tree-structured ontology of each table.
First, internal skilled annotators denote the parent column for each column header.
Then, a larger number of annotators provide a sentential description of an automatically-chosen subset of table cells in a row.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Under MIT license (see [here](https://github.com/Yale-LILY/dart/blob/master/LICENSE))
### Citation Information
```
@article{radev2020dart,
title={DART: Open-Domain Structured Data Record to Text Generation},
author={Dragomir Radev and Rui Zhang and Amrit Rau and Abhinand Sivaprasad and Chiachun Hsieh and Nazneen Fatema Rajani and Xiangru Tang and Aadit Vyas and Neha Verma and Pranav Krishna and Yangxiaokang Liu and Nadia Irwanto and Jessica Pan and Faiaz Rahman and Ahmad Zaidi and Murori Mutuma and Yasin Tarabar and Ankit Gupta and Tao Yu and Yi Chern Tan and Xi Victoria Lin and Caiming Xiong and Richard Socher},
journal={arXiv preprint arXiv:2007.02871},
  year={2020}
}
```
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
Lisandro2/teste | ---
license: openrail
task_categories:
- text-classification
language:
- pt
tags:
- finance
pretty_name: teste_liso
size_categories:
- n<1K
--- |
gguichard/wsd_myriade_synth_data_multilabel | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: float64
splits:
- name: train
num_bytes: 26227913
num_examples: 39518
download_size: 6754273
dataset_size: 26227913
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mHossain/final_train_v4_test_800000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 6685194.6
num_examples: 18000
- name: test
num_bytes: 742799.4
num_examples: 2000
download_size: 3208395
dataset_size: 7427994.0
---
# Dataset Card for "final_train_v4_test_800000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ftshijt/ML_SUPERB | ---
license: apache-2.0
---
|
jpawan33/fkr30k-image-captioning-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1625135945.608
num_examples: 31782
download_size: 1621386563
dataset_size: 1625135945.608
---
# Dataset Card for "fkr30k-image-captioning-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liminerity__Phigments12 | ---
pretty_name: Evaluation run of liminerity/Phigments12
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Phigments12](https://huggingface.co/liminerity/Phigments12) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Phigments12\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T17:32:17.179865](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Phigments12/blob/main/results_2024-03-09T17-32-17.179865.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5876134592184494,\n\
\ \"acc_stderr\": 0.033687338477498184,\n \"acc_norm\": 0.5881587552403912,\n\
\ \"acc_norm_stderr\": 0.0343790096457623,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5170709953785974,\n\
\ \"mc2_stderr\": 0.015471659939020242\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.01434203648343618,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.584744074885481,\n\
\ \"acc_stderr\": 0.004917590378138211,\n \"acc_norm\": 0.7709619597689703,\n\
\ \"acc_norm_stderr\": 0.004193549762600656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n\
\ \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.4978723404255319,\n\
\ \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n\
\ \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\"\
: 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n\
\ \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.02614868593067175,\n\
\ \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.02614868593067175\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515001,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739153,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739153\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.01644832168676905,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.01644832168676905\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879695,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879695\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n\
\ \"acc_stderr\": 0.015235075776719616,\n \"acc_norm\": 0.293854748603352,\n\
\ \"acc_norm_stderr\": 0.015235075776719616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363937,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363937\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969765,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5170709953785974,\n\
\ \"mc2_stderr\": 0.015471659939020242\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233618\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \
\ \"acc_stderr\": 0.013413955095965309\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Phigments12
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|arc:challenge|25_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|gsm8k|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hellaswag|10_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-32-17.179865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T17-32-17.179865.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- '**/details_harness|winogrande|5_2024-03-09T17-32-17.179865.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T17-32-17.179865.parquet'
- config_name: results
data_files:
- split: 2024_03_09T17_32_17.179865
path:
- results_2024-03-09T17-32-17.179865.parquet
- split: latest
path:
- results_2024-03-09T17-32-17.179865.parquet
---
# Dataset Card for Evaluation run of liminerity/Phigments12
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Phigments12](https://huggingface.co/liminerity/Phigments12) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Phigments12",
"harness_winogrande_5",
split="train")
```
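As the configs above show, each run's split name is simply the run timestamp with `-` and `:` replaced by `_`. A minimal helper to derive a split name from a run timestamp (a sketch for illustration; this function is hypothetical and not part of the `datasets` API):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a run timestamp like '2024-03-09T17:32:17.179865' to the
    split-name form used in this dataset, e.g. '2024_03_09T17_32_17.179865'."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

# The run documented in this card:
print(split_name_from_timestamp("2024-03-09T17:32:17.179865"))
# → 2024_03_09T17_32_17.179865
```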
## Latest results
These are the [latest results from run 2024-03-09T17:32:17.179865](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Phigments12/blob/main/results_2024-03-09T17-32-17.179865.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5876134592184494,
"acc_stderr": 0.033687338477498184,
"acc_norm": 0.5881587552403912,
"acc_norm_stderr": 0.0343790096457623,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5170709953785974,
"mc2_stderr": 0.015471659939020242
},
"harness|arc:challenge|25": {
"acc": 0.5955631399317406,
"acc_stderr": 0.01434203648343618,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759077
},
"harness|hellaswag|10": {
"acc": 0.584744074885481,
"acc_stderr": 0.004917590378138211,
"acc_norm": 0.7709619597689703,
"acc_norm_stderr": 0.004193549762600656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515001,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739153,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739153
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03283472056108561,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03283472056108561
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.01644832168676905,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.01644832168676905
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879695,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879695
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.015235075776719616,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.015235075776719616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363937,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363937
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969765,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355575,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355575
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5170709953785974,
"mc2_stderr": 0.015471659939020242
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233618
},
"harness|gsm8k|5": {
"acc": 0.6133434420015162,
"acc_stderr": 0.013413955095965309
}
}
```
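The per-task numbers above can be aggregated in plain Python once the results JSON is loaded. A minimal sketch (the `mean_metric` helper is hypothetical, not part of the official leaderboard tooling; only a small subset of the tasks above is reproduced):

```python
# Sketch: average per-task scores from a results dict shaped like the
# "Latest results" JSON above, preferring acc_norm where it exists.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6262798634812287},
    "harness|hellaswag|10": {"acc_norm": 0.7709619597689703},
    "harness|winogrande|5": {"acc": 0.7466456195737964},
    "harness|gsm8k|5": {"acc": 0.6133434420015162},
}

def mean_metric(res: dict, metric_priority=("acc_norm", "acc")) -> float:
    """Average the first available metric from each task entry."""
    values = []
    for metrics in res.values():
        for m in metric_priority:
            if m in metrics:
                values.append(metrics[m])
                break
    return sum(values) / len(values)

print(round(mean_metric(results), 4))
```

Note this is an unweighted mean over task entries; the leaderboard's own aggregation may weight or group tasks differently.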
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SaladSlayer00/twin_matcher_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 365466216.282
num_examples: 9067
- name: test
num_bytes: 4071086.0
num_examples: 106
download_size: 368598774
dataset_size: 369537302.282
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
melifay/turkishReviews-ds-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 896651
dataset_size: 1392332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
sukantan/nyaya-st-test | ---
dataset_info:
features:
- name: test_id
dtype: string
- name: act
dtype: string
- name: section_no
dtype: string
- name: case_matter
dtype: string
splits:
- name: train
num_bytes: 94271
num_examples: 492
download_size: 45018
dataset_size: 94271
---
# Dataset Card for "nyaya-st-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
openlifescienceai/mmlu_anatomy | ---
dataset_info:
features:
- name: subject_name
dtype: string
- name: data
struct:
- name: Correct Answer
dtype: string
- name: Correct Option
dtype: string
- name: Options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: Question
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 44083
num_examples: 135
- name: validation
num_bytes: 4252
num_examples: 14
- name: dev
num_bytes: 1323
num_examples: 5
download_size: 52092
dataset_size: 49658
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: dev
path: data/dev-*
---
|
allganize/flare-fiqasa-ko | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: test
num_bytes: 52262
num_examples: 204
download_size: 19986
dataset_size: 52262
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
license: mit
language:
- ko
---
# flare-fiqasa-ko
### Dataset Description
- `flare-fiqasa-ko` is a sentiment analysis dataset over financial-domain news headlines.
The input consists of text only.
- To create the Korean data, the test set of [ChanceFocus/flare-fiqasa](https://huggingface.co/datasets/ChanceFocus/flare-fiqasa) was first translated with Allganize's in-house translation model, Allganize Translator.
Mistranslated examples were then removed manually, resulting in 204 evaluation examples.
### Data Source
- [ChanceFocus/flare-fiqasa](https://huggingface.co/datasets/ChanceFocus/flare-fiqasa)
### Data Example
```
{
'conversation_id': 'fiqasa938',
'conversations': array([
{
'from': 'human',
'value': '''다음 재무 게시물의 감정은 무엇인가요? 긍정, 부정 또는 중립인가요?
텍스트: $BBRY 실제로 부채가 없고 현금 3.1달러를 포함하면 주당 0.03달러의 손실을 입었습니다.
정답:'''
},
{
'from': 'gpt',
'value': '부정'
}
], dtype=object)
}
``` |
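Each record above is a simple two-turn human/gpt conversation, so splitting it into an (input, label) pair for evaluation is straightforward. A minimal sketch (the `to_pair` helper is hypothetical; the record mirrors the example shown above, with the prompt abbreviated):

```python
# Sketch: turn one flare-fiqasa-ko record into a (prompt, label) pair.
# In the real dataset, `conversations` is a list of {"from", "value"} dicts.
record = {
    "conversation_id": "fiqasa938",
    "conversations": [
        {"from": "human", "value": "다음 재무 게시물의 감정은 무엇인가요? ..."},
        {"from": "gpt", "value": "부정"},
    ],
}

def to_pair(rec: dict) -> tuple[str, str]:
    """Return (prompt, label) from a two-turn conversation record."""
    turns = {t["from"]: t["value"] for t in rec["conversations"]}
    return turns["human"], turns["gpt"]

prompt, label = to_pair(record)
print(label)  # gold sentiment label; "부정" means "negative"
```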
Taskin123/Classification-2 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct | ---
pretty_name: Evaluation run of kyujinpy/Sakura-SOLAR-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kyujinpy/Sakura-SOLAR-Instruct](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T14:31:12.994833](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct/blob/main/results_2023-12-27T14-31-12.994833.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6670794648633737,\n\
\ \"acc_stderr\": 0.03162151337270039,\n \"acc_norm\": 0.6678288182149681,\n\
\ \"acc_norm_stderr\": 0.03226675533800617,\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7178732393039171,\n\
\ \"mc2_stderr\": 0.01499862160665204\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7128062139016133,\n\
\ \"acc_stderr\": 0.004515280911468822,\n \"acc_norm\": 0.8841864170483967,\n\
\ \"acc_norm_stderr\": 0.0031934725302821703\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236786,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236786\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n\
\ \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n\
\ \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7178732393039171,\n\
\ \"mc2_stderr\": 0.01499862160665204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273766\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6520090978013646,\n \
\ \"acc_stderr\": 0.013120581030382132\n }\n}\n```"
repo_url: https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|arc:challenge|25_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|gsm8k|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hellaswag|10_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-31-12.994833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T14-31-12.994833.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- '**/details_harness|winogrande|5_2023-12-27T14-31-12.994833.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T14-31-12.994833.parquet'
- config_name: results
data_files:
- split: 2023_12_27T14_31_12.994833
path:
- results_2023-12-27T14-31-12.994833.parquet
- split: latest
path:
- results_2023-12-27T14-31-12.994833.parquet
---
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-27T14:31:12.994833](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct/blob/main/results_2023-12-27T14-31-12.994833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6670794648633737,
"acc_stderr": 0.03162151337270039,
"acc_norm": 0.6678288182149681,
"acc_norm_stderr": 0.03226675533800617,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7178732393039171,
"mc2_stderr": 0.01499862160665204
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520767
},
"harness|hellaswag|10": {
"acc": 0.7128062139016133,
"acc_stderr": 0.004515280911468822,
"acc_norm": 0.8841864170483967,
"acc_norm_stderr": 0.0031934725302821703
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236786,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236786
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7178732393039171,
"mc2_stderr": 0.01499862160665204
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273766
},
"harness|gsm8k|5": {
"acc": 0.6520090978013646,
"acc_stderr": 0.013120581030382132
}
}
```
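Since the results payload above is plain JSON, it can be inspected directly as a Python dict. A minimal sketch (using a handful of values copied verbatim from the block above) that ranks tasks by accuracy:

```python
# Minimal sketch: rank a few of the per-task accuracies reported above.
# Keys and values are copied verbatim from the results JSON in this card;
# the full payload has many more "harness|..." entries with the same shape.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.42},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.91},
    "harness|winogrande|5": {"acc": 0.8366219415943172},
    "harness|gsm8k|5": {"acc": 0.6520090978013646},
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{task}: {metrics['acc']:.4f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` keys where a task reports them.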
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_leonarad__hope_for | ---
pretty_name: Evaluation run of leonarad/hope_for
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leonarad/hope_for](https://huggingface.co/leonarad/hope_for) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leonarad__hope_for\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-08T02:11:21.059412](https://huggingface.co/datasets/open-llm-leaderboard/details_leonarad__hope_for/blob/main/results_2024-03-08T02-11-21.059412.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5132979348388889,\n\
\ \"acc_stderr\": 0.03415692024039109,\n \"acc_norm\": 0.5194765537743715,\n\
\ \"acc_norm_stderr\": 0.03491704332563057,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.015298077509485081,\n \"mc2\": 0.4072751446221107,\n\
\ \"mc2_stderr\": 0.01398560051759931\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120177,\n\
\ \"acc_norm\": 0.5127986348122867,\n \"acc_norm_stderr\": 0.014606603181012538\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5534754033061143,\n\
\ \"acc_stderr\": 0.004961161589228411,\n \"acc_norm\": 0.747361083449512,\n\
\ \"acc_norm_stderr\": 0.004336375492801793\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.0303650508291152,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.0303650508291152\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045105,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045105\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317213,\n \"\
acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317213\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.01910929984609829,\n \"acc_norm\"\
: 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609829\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842534,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842534\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348923,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348923\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966318,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966318\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.028304576673141107,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.028304576673141107\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3644067796610169,\n\
\ \"acc_stderr\": 0.012291694983056477,\n \"acc_norm\": 0.3644067796610169,\n\
\ \"acc_norm_stderr\": 0.012291694983056477\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6204081632653061,\n\
\ \"acc_stderr\": 0.031067211262872475,\n \"acc_norm\": 0.6204081632653061,\n\
\ \"acc_norm_stderr\": 0.031067211262872475\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014638,\n\
\ \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014638\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.038743715565879536,\n\
\ \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.038743715565879536\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n\
\ \"acc_stderr\": 0.03424042924691584,\n \"acc_norm\": 0.7251461988304093,\n\
\ \"acc_norm_stderr\": 0.03424042924691584\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.015298077509485081,\n\
\ \"mc2\": 0.4072751446221107,\n \"mc2_stderr\": 0.01398560051759931\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.7261247040252565,\n\
\ \"acc_stderr\": 0.012533292732620296\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.16906747536012132,\n \"acc_stderr\": 0.010324171445497361\n\
\ }\n}\n```"
repo_url: https://huggingface.co/leonarad/hope_for
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|arc:challenge|25_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|gsm8k|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hellaswag|10_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-08T02-11-21.059412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-08T02-11-21.059412.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- '**/details_harness|winogrande|5_2024-03-08T02-11-21.059412.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-08T02-11-21.059412.parquet'
- config_name: results
data_files:
- split: 2024_03_08T02_11_21.059412
path:
- results_2024-03-08T02-11-21.059412.parquet
- split: latest
path:
- results_2024-03-08T02-11-21.059412.parquet
---
# Dataset Card for Evaluation run of leonarad/hope_for
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leonarad/hope_for](https://huggingface.co/leonarad/hope_for) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leonarad__hope_for",
"harness_winogrande_5",
split="train")
```
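Because every new run adds a split named after its timestamp (in `YYYY_MM_DDTHH_MM_SS.ffffff` form), the newest run can be identified by simply sorting the split names: with this fixed-width, zero-padded format, lexicographic order matches chronological order. A minimal sketch (the second split name here is illustrative, not a real run):

```python
# Timestamped split names in the YYYY_MM_DDTHH_MM_SS.ffffff format sort
# lexicographically in chronological order, so max() picks the newest run.
splits = [
    "2024_03_08T02_11_21.059412",  # the run in this repo
    "2024_01_15T10_00_00.000000",  # hypothetical earlier run
]

latest = max(splits)
print(latest)  # -> 2024_03_08T02_11_21.059412
```

This is the same resolution the "latest" split performs for you, so in practice you can just request `split="latest"` as shown above.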
## Latest results
These are the [latest results from run 2024-03-08T02:11:21.059412](https://huggingface.co/datasets/open-llm-leaderboard/details_leonarad__hope_for/blob/main/results_2024-03-08T02-11-21.059412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```json
{
"all": {
"acc": 0.5132979348388889,
"acc_stderr": 0.03415692024039109,
"acc_norm": 0.5194765537743715,
"acc_norm_stderr": 0.03491704332563057,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485081,
"mc2": 0.4072751446221107,
"mc2_stderr": 0.01398560051759931
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120177,
"acc_norm": 0.5127986348122867,
"acc_norm_stderr": 0.014606603181012538
},
"harness|hellaswag|10": {
"acc": 0.5534754033061143,
"acc_stderr": 0.004961161589228411,
"acc_norm": 0.747361083449512,
"acc_norm_stderr": 0.004336375492801793
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.0303650508291152,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.0303650508291152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045105,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609829,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609829
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842534,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842534
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716677,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716677
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6883780332056194,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.6883780332056194,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348923,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348923
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966318,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966318
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.028304576673141107,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.028304576673141107
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3644067796610169,
"acc_stderr": 0.012291694983056477,
"acc_norm": 0.3644067796610169,
"acc_norm_stderr": 0.012291694983056477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485081,
"mc2": 0.4072751446221107,
"mc2_stderr": 0.01398560051759931
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620296
},
"harness|gsm8k|5": {
"acc": 0.16906747536012132,
"acc_stderr": 0.010324171445497361
}
}
```
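For reference, each `results_*.json` file nests per-task metrics under harness task names, with the aggregate stored under `"all"`. A minimal sketch of reading one such file (values abbreviated from the run above; the inline JSON stands in for a downloaded copy):

```python
import json

# Stand-in for a downloaded results_*.json file; the structure mirrors
# the JSON shown above: an "all" aggregate plus one entry per task.
raw = """
{
  "all": {"acc": 0.5133, "acc_norm": 0.5195, "mc2": 0.4073},
  "harness|gsm8k|5": {"acc": 0.1691, "acc_stderr": 0.0103}
}
"""
results = json.loads(raw)

print(results["all"]["acc_norm"])          # aggregate normalized accuracy
print(results["harness|gsm8k|5"]["acc"])   # per-task accuracy
```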
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
guangyil/yelp_short_llama | ---
dataset_info:
features:
- name: bert_token
sequence: int64
- name: gpt2_token
sequence: int64
splits:
- name: train
num_bytes: 96353928.56745644
num_examples: 447258
- name: test
num_bytes: 239280.0
num_examples: 1000
download_size: 22101899
dataset_size: 96593208.56745644
---
# Dataset Card for "yelp_short_llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLCommons/ml_spoken_words | ---
annotations_creators:
- machine-generated
language_creators:
- other
language:
- ar
- as
- br
- ca
- cnh
- cs
- cv
- cy
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fr
- fy
- ga
- gn
- ha
- ia
- id
- it
- ka
- ky
- lt
- lv
- mn
- mt
- nl
- or
- pl
- pt
- rm
- ro
- ru
- rw
- sah
- sk
- sl
- sv
- ta
- tr
- tt
- uk
- vi
- zh
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10M<n<100M
source_datasets:
- extended|common_voice
task_categories:
- audio-classification
task_ids: []
pretty_name: Multilingual Spoken Words
language_bcp47:
- fy-NL
- ga-IE
- rm-sursilv
- rm-vallader
- sv-SE
- zh-CN
tags:
- other-keyword-spotting
---
# Dataset Card for Multilingual Spoken Words
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://mlcommons.org/en/multilingual-spoken-words/
- **Repository:** https://github.com/harvard-edge/multilingual_kws
- **Paper:** https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/file/fe131d7f5a6b38b23cc967316c13dae2-Paper-round2.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Multilingual Spoken Words Corpus is a large and growing audio dataset of spoken
words in 50 languages collectively spoken by over 5 billion people, for academic
research and commercial applications in keyword spotting and spoken term search,
licensed under CC-BY 4.0. The dataset contains more than 340,000 keywords,
totaling 23.4 million 1-second spoken examples (over 6,000 hours). The dataset
has many use cases, ranging from voice-enabled consumer devices to call center
automation. This dataset is generated by applying forced alignment on crowd-sourced sentence-level
audio to produce per-word timing estimates for extraction.
All alignments are included in the dataset.
Data is provided in two formats: `wav` (16KHz) and `opus` (48KHz). Default configurations look like
`"{lang}_{format}"`, so to load, for example, Tatar in wav format do:
```python
ds = load_dataset("MLCommons/ml_spoken_words", "tt_wav")
```
To download multiple languages in a single dataset pass list of languages to `languages` argument:
```python
ds = load_dataset("MLCommons/ml_spoken_words", languages=["ar", "tt", "br"])
```
To download a specific format pass it to the `format` argument (default format is `wav`):
```python
ds = load_dataset("MLCommons/ml_spoken_words", languages=["ar", "tt", "br"], format="opus")
```
Note that each time you provide different sets of languages,
examples are generated from scratch even if you already provided one or several of them before
because custom configurations are created each time (the data is **not** redownloaded though).
### Supported Tasks and Leaderboards
Keyword spotting, Spoken term search
### Languages
The dataset is multilingual. To specify several languages to download pass a list of them to the
`languages` argument:
```python
ds = load_dataset("MLCommons/ml_spoken_words", languages=["ar", "tt", "br"])
```
The dataset contains data for the following languages:
Low-resourced (<10 hours):
* Arabic (0.1G, 7.6h)
* Assamese (0.9M, 0.1h)
* Breton (69M, 5.6h)
* Chuvash (28M, 2.1h)
* Chinese (zh-CN) (42M, 3.1h)
* Dhivehi (0.7M, 0.04h)
* Frisian (0.1G, 9.6h)
* Georgian (20M, 1.4h)
* Guarani (0.7M, 1.3h)
* Greek (84M, 6.7h)
* Hakha Chin (26M, 0.1h)
* Hausa (90M, 1.0h)
* Interlingua (58M, 4.0h)
* Irish (38M, 3.2h)
* Latvian (51M, 4.2h)
* Lithuanian (21M, 0.46h)
* Maltese (88M, 7.3h)
* Oriya (0.7M, 0.1h)
* Romanian (59M, 4.5h)
* Sakha (42M, 3.3h)
* Slovenian (43M, 3.0h)
* Slovak (31M, 1.9h)
* Sursilvan (61M, 4.8h)
* Tamil (8.8M, 0.6h)
* Vallader (14M, 1.2h)
* Vietnamese (1.2M, 0.1h)
Medium-resourced (>10 & <100 hours):
* Czech (0.3G, 24h)
* Dutch (0.8G, 70h)
* Estonian (0.2G, 19h)
* Esperanto (1.3G, 77h)
* Indonesian (0.1G, 11h)
* Kyrgyz (0.1G, 12h)
* Mongolian (0.1G, 12h)
* Portuguese (0.7G, 58h)
* Swedish (0.1G, 12h)
* Tatar (4G, 30h)
* Turkish (1.3G, 29h)
* Ukrainian (0.2G, 18h)
High-resourced (>100 hours):
* Basque (1.7G, 118h)
* Catalan (8.7G, 615h)
* English (26G, 1957h)
* French (9.3G, 754h)
* German (14G, 1083h)
* Italian (2.2G, 155h)
* Kinyarwanda (6.1G, 422h)
* Persian (4.5G, 327h)
* Polish (1.8G, 130h)
* Russian (2.1G, 137h)
* Spanish (4.9G, 349h)
* Welsh (4.5G, 108h)
## Dataset Structure
### Data Instances
```python
{'file': 'абзар_common_voice_tt_17737010.opus',
'is_valid': True,
'language': 0,
'speaker_id': '687025afd5ce033048472754c8d2cb1cf8a617e469866bbdb3746e2bb2194202094a715906f91feb1c546893a5d835347f4869e7def2e360ace6616fb4340e38',
'gender': 0,
'keyword': 'абзар',
'audio': {'path': 'абзар_common_voice_tt_17737010.opus',
'array': array([2.03458695e-34, 2.03458695e-34, 2.03458695e-34, ...,
2.03458695e-34, 2.03458695e-34, 2.03458695e-34]),
'sampling_rate': 48000}}
```
### Data Fields
* file: string, relative audio path inside the archive
* is_valid: if a sample is valid
* language: language of an instance. Makes sense only when providing multiple languages to the
dataset loader (for example, `load_dataset("ml_spoken_words", languages=["ar", "tt"])`)
* speaker_id: unique id of a speaker. Can be "NA" if an instance is invalid
* gender: speaker gender. Can be one of `["MALE", "FEMALE", "OTHER", "NAN"]`
* keyword: word spoken in a current sample
* audio: a dictionary containing the relative path to the audio file,
the decoded audio array, and the sampling rate.
Note that when accessing the audio column (`dataset[0]["audio"]`), the audio file is automatically
decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling
a large number of audio files can take a significant amount of time.
Thus, it is important to first query the sample index before the "audio" column,
i.e. `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
### Data Splits
The data for each language is split into train / validation / test parts.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The data comes from the Common Voice dataset.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
The dataset consists of recordings from people who have donated their voices online.
You agree not to attempt to determine the identity of the speakers.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is licensed under [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/) and can be used for academic
research and commercial applications in keyword spotting and spoken term search.
### Citation Information
```
@inproceedings{mazumder2021multilingual,
title={Multilingual Spoken Words Corpus},
author={Mazumder, Mark and Chitlangia, Sharad and Banbury, Colby and Kang, Yiping and Ciro, Juan Manuel and Achorn, Keith and Galvez, Daniel and Sabini, Mark and Mattson, Peter and Kanter, David and others},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021}
}
```
### Contributions
Thanks to [@polinaeterna](https://github.com/polinaeterna) for adding this dataset.
|
open-llm-leaderboard/details_TommyZQ__tmm-1b | ---
pretty_name: Evaluation run of TommyZQ/tmm-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TommyZQ/tmm-1b](https://huggingface.co/TommyZQ/tmm-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TommyZQ__tmm-1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T14:48:22.525913](https://huggingface.co/datasets/open-llm-leaderboard/details_TommyZQ__tmm-1b/blob/main/results_2024-04-15T14-48-22.525913.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2619110343220089,\n\
\ \"acc_stderr\": 0.0310072602836517,\n \"acc_norm\": 0.2636980725459281,\n\
\ \"acc_norm_stderr\": 0.03177987159903618,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301139,\n \"mc2\": 0.37215966211358026,\n\
\ \"mc2_stderr\": 0.013866797243664805\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30716723549488056,\n \"acc_stderr\": 0.013481034054980945,\n\
\ \"acc_norm\": 0.33361774744027306,\n \"acc_norm_stderr\": 0.013778687054176536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4387572196773551,\n\
\ \"acc_stderr\": 0.00495220983185658,\n \"acc_norm\": 0.5846444931288588,\n\
\ \"acc_norm_stderr\": 0.004917761181740154\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514203,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514203\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215453,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215453\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n\
\ \"acc_stderr\": 0.02218571009225226,\n \"acc_norm\": 0.1870967741935484,\n\
\ \"acc_norm_stderr\": 0.02218571009225226\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489596,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489596\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922987,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922987\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178267,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178267\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006166,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006166\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.02738140692786896,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.02738140692786896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.18404907975460122,\n \"acc_stderr\": 0.030446777687971757,\n\
\ \"acc_norm\": 0.18404907975460122,\n \"acc_norm_stderr\": 0.030446777687971757\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21966794380587484,\n\
\ \"acc_stderr\": 0.014805384478371176,\n \"acc_norm\": 0.21966794380587484,\n\
\ \"acc_norm_stderr\": 0.014805384478371176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.19753086419753085,\n \"acc_stderr\": 0.022152889927898947,\n\
\ \"acc_norm\": 0.19753086419753085,\n \"acc_norm_stderr\": 0.022152889927898947\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.02564555362226673,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.02564555362226673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125478,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125478\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22875816993464052,\n \"acc_stderr\": 0.016992723465466254,\n \
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.016992723465466254\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2979591836734694,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.2979591836734694,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.036155076303109344,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.036155076303109344\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301139,\n \"mc2\": 0.37215966211358026,\n\
\ \"mc2_stderr\": 0.013866797243664805\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.580110497237569,\n \"acc_stderr\": 0.013870943986310391\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \
\ \"acc_stderr\": 0.0028227133223877035\n }\n}\n```"
repo_url: https://huggingface.co/TommyZQ/tmm-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-48-22.525913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-48-22.525913.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- '**/details_harness|winogrande|5_2024-04-15T14-48-22.525913.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T14-48-22.525913.parquet'
- config_name: results
data_files:
- split: 2024_04_15T14_48_22.525913
path:
- results_2024-04-15T14-48-22.525913.parquet
- split: latest
path:
- results_2024-04-15T14-48-22.525913.parquet
---
# Dataset Card for Evaluation run of TommyZQ/tmm-1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TommyZQ/tmm-1b](https://huggingface.co/TommyZQ/tmm-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TommyZQ__tmm-1b",
"harness_winogrande_5",
split="train")
```
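The per-task config names in this repo follow a mechanical pattern derived from the harness task ids (for example `harness|truthfulqa:mc|0` becomes config `harness_truthfulqa_mc_0`). A small helper, shown here as an informal sketch rather than part of the official tooling, can build the config name for any task:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the dataset config name for a harness task, e.g.
    ("hendrycksTest-moral_disputes", 5) -> "harness_hendrycksTest_moral_disputes_5".
    """
    # Config names replace the ':' and '-' separators with underscores
    # and append the few-shot count.
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{num_fewshot}"

# Example: load the TruthfulQA details for this model
# (requires network access to the Hugging Face Hub):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_TommyZQ__tmm-1b",
#     harness_config_name("truthfulqa:mc", 0),
#     split="latest",
# )
```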
## Latest results
These are the [latest results from run 2024-04-15T14:48:22.525913](https://huggingface.co/datasets/open-llm-leaderboard/details_TommyZQ__tmm-1b/blob/main/results_2024-04-15T14-48-22.525913.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2619110343220089,
"acc_stderr": 0.0310072602836517,
"acc_norm": 0.2636980725459281,
"acc_norm_stderr": 0.03177987159903618,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301139,
"mc2": 0.37215966211358026,
"mc2_stderr": 0.013866797243664805
},
"harness|arc:challenge|25": {
"acc": 0.30716723549488056,
"acc_stderr": 0.013481034054980945,
"acc_norm": 0.33361774744027306,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.4387572196773551,
"acc_stderr": 0.00495220983185658,
"acc_norm": 0.5846444931288588,
"acc_norm_stderr": 0.004917761181740154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343602,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343602
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514203,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514203
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215453,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215453
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.02218571009225226,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.02218571009225226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489596,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489596
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178267,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178267
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006166,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006166
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.02738140692786896,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.02738140692786896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217885,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.18404907975460122,
"acc_stderr": 0.030446777687971757,
"acc_norm": 0.18404907975460122,
"acc_norm_stderr": 0.030446777687971757
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21966794380587484,
"acc_stderr": 0.014805384478371176,
"acc_norm": 0.21966794380587484,
"acc_norm_stderr": 0.014805384478371176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.19753086419753085,
"acc_stderr": 0.022152889927898947,
"acc_norm": 0.19753086419753085,
"acc_norm_stderr": 0.022152889927898947
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.02564555362226673,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.02564555362226673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.016992723465466254,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.016992723465466254
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2979591836734694,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.2979591836734694,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301139,
"mc2": 0.37215966211358026,
"mc2_stderr": 0.013866797243664805
},
"harness|winogrande|5": {
"acc": 0.580110497237569,
"acc_stderr": 0.013870943986310391
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
}
}
```
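The top-level `"all"` block is (roughly) a macro-average of the per-task metrics. A minimal sketch of that aggregation, assuming a dict shaped like the results above; the exact leaderboard computation may differ in which tasks it includes:

```python
def aggregate_metric(per_task: dict, metric: str = "acc") -> float:
    """Macro-average a metric over all tasks that report it
    (a rough sketch of how the top-level "all" block is derived)."""
    values = [m[metric] for m in per_task.values() if metric in m]
    return sum(values) / len(values)

sample = {
    "harness|winogrande|5": {"acc": 0.58},
    "harness|gsm8k|5": {"acc": 0.01},
}
print(aggregate_metric(sample))  # ≈ 0.295
```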
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lmms-lab/textvqa | ---
dataset_info:
features:
- name: image_id
dtype: string
- name: question_id
dtype: int32
- name: question
dtype: string
- name: question_tokens
sequence: string
- name: image
dtype: image
- name: image_width
dtype: int32
- name: image_height
dtype: int32
- name: flickr_original_url
dtype: string
- name: flickr_300k_url
dtype: string
- name: answers
sequence: string
- name: image_classes
sequence: string
- name: set_name
dtype: string
- name: ocr_tokens
sequence: string
splits:
- name: train
num_bytes: 9839776032.652
num_examples: 34602
- name: validation
num_bytes: 1438831837.0
num_examples: 5000
- name: test
num_bytes: 1712000724.844
num_examples: 5734
download_size: 8097805782
dataset_size: 12990608594.496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [TextVQA](https://textvqa.org/). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
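TextVQA is conventionally scored with the soft VQA accuracy metric over the reference `answers` field: a prediction earns credit in proportion to how many human answers it matches, capped at 1. A minimal sketch, assuming simple lowercase/whitespace normalization rather than the full official answer processing:

```python
def vqa_accuracy(prediction: str, answers: list[str]) -> float:
    """Soft VQA accuracy: min(#matching reference answers / 3, 1.0)."""
    pred = prediction.strip().lower()
    matches = sum(a.strip().lower() == pred for a in answers)
    return min(matches / 3.0, 1.0)
```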
```
@inproceedings{singh2019towards,
title={Towards vqa models that can read},
author={Singh, Amanpreet and Natarajan, Vivek and Shah, Meet and Jiang, Yu and Chen, Xinlei and Batra, Dhruv and Parikh, Devi and Rohrbach, Marcus},
booktitle={Proceedings of the IEEE/CVF conference on computer vision and pattern recognition},
pages={8317--8326},
year={2019}
}
```
|
yilunzhao/robut | ---
license: mit
---
## RobuT Dataset
A benchmark based on existing Table QA datasets (WTQ, WikiSQL-Weak, and SQA) that includes human-annotated adversarial perturbations to table headers, table content, and questions.
## Code
Please refer to our [github repo](https://github.com/yilunzhao/RobuT) for code implementation.
## Contact
For any issues or questions, please email Yilun Zhao (yilun.zhao@yale.edu).
## Citation
```
@inproceedings{zhao-etal-2023-robut,
title = "{R}obu{T}: A Systematic Study of Table {QA} Robustness Against Human-Annotated Adversarial Perturbations",
author = "Zhao, Yilun and
Zhao, Chen and
Nan, Linyong and
Qi, Zhenting and
Zhang, Wenlin and
Tang, Xiangru and
Mi, Boyu and
Radev, Dragomir",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.334",
doi = "10.18653/v1/2023.acl-long.334",
pages = "6064--6081",
}
``` |
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b | ---
pretty_name: Evaluation run of PY007/TinyLlama-1.1B-step-50K-105b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PY007/TinyLlama-1.1B-step-50K-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T19:38:12.477684](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b/blob/main/results_2023-10-24T19-38-12.477684.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857113,\n \"f1\": 0.039109689597315506,\n\
\ \"f1_stderr\": 0.001116278403063508,\n \"acc\": 0.27455565641618196,\n\
\ \"acc_stderr\": 0.0079998796659362\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857113,\n\
\ \"f1\": 0.039109689597315506,\n \"f1_stderr\": 0.001116278403063508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480774\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.013998453610924324\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T19_38_12.477684
path:
- '**/details_harness|drop|3_2023-10-24T19-38-12.477684.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T19-38-12.477684.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T19_38_12.477684
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-38-12.477684.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-38-12.477684.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T19_38_12.477684
path:
- '**/details_harness|winogrande|5_2023-10-24T19-38-12.477684.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T19-38-12.477684.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- results_2023-09-12T12-30-04.204611.parquet
- split: 2023_10_24T19_38_12.477684
path:
- results_2023-10-24T19-38-12.477684.parquet
- split: latest
path:
- results_2023-10-24T19-38-12.477684.parquet
---
# Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-step-50K-105b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PY007/TinyLlama-1.1B-step-50K-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T19:38:12.477684](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b/blob/main/results_2023-10-24T19-38-12.477684.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857113,
"f1": 0.039109689597315506,
"f1_stderr": 0.001116278403063508,
"acc": 0.27455565641618196,
"acc_stderr": 0.0079998796659362
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857113,
"f1": 0.039109689597315506,
"f1_stderr": 0.001116278403063508
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480774
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.013998453610924324
}
}
```
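As an illustrative sketch (not part of the evaluation pipeline), the headline per-task metrics can be pulled out of a results dict shaped like the one above; the values below are copied from the latest run, trimmed to the fields used:

```python
# Minimal sketch: extract per-task accuracy from a results dict shaped like
# the one shown above. The dict literal is a trimmed copy of the latest run.
results = {
    "all": {"acc": 0.27455565641618196, "f1": 0.039109689597315506},
    "harness|winogrande|5": {"acc": 0.5438042620363063},
    "harness|gsm8k|5": {"acc": 0.00530705079605762},
}

# Collect accuracy per harness task, skipping the aggregate "all" entry
# (and tasks like drop that report only em/f1).
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# best_task → "harness|winogrande|5"
best_task = max(per_task_acc, key=per_task_acc.get)
```

The same pattern applies to the full JSON file linked above, which nests one entry per `harness|<task>|<n_shot>` key.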
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
awacke1/ICD10-Clinical-Terminology | ---
license: mit
---
# ICD10-Clinical-Terminology

A PyArrow fast-search demonstration for context AI MMoE.
dummybrendan/animals | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 690375299.39
num_examples: 5399
download_size: 696333284
dataset_size: 690375299.39
---
|
jlbaker361/spider-150 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: frame
dtype: int64
splits:
- name: train
num_bytes: 3459464484.0
num_examples: 800
download_size: 3459591737
dataset_size: 3459464484.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
flaviosilva/testevozflavio | ---
license: openrail
---
|
open-llm-leaderboard/details_upstage__llama-30b-instruct | ---
pretty_name: Evaluation run of upstage/llama-30b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [upstage/llama-30b-instruct](https://huggingface.co/upstage/llama-30b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__llama-30b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T15:33:08.826830](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct/blob/main/results_2023-09-17T15-33-08.826830.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19924496644295303,\n\
\ \"em_stderr\": 0.004090563786479079,\n \"f1\": 0.2739314177852351,\n\
\ \"f1_stderr\": 0.004108459298679424,\n \"acc\": 0.46317766024223705,\n\
\ \"acc_stderr\": 0.01006349395660694\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19924496644295303,\n \"em_stderr\": 0.004090563786479079,\n\
\ \"f1\": 0.2739314177852351,\n \"f1_stderr\": 0.004108459298679424\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \
\ \"acc_stderr\": 0.0089928884972756\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\
\ }\n}\n```"
repo_url: https://huggingface.co/upstage/llama-30b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T15_33_08.826830
path:
- '**/details_harness|drop|3_2023-09-17T15-33-08.826830.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T15-33-08.826830.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T15_33_08.826830
path:
- '**/details_harness|gsm8k|5_2023-09-17T15-33-08.826830.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T15-33-08.826830.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:33:00.369415.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:33:00.369415.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T15_33_08.826830
path:
- '**/details_harness|winogrande|5_2023-09-17T15-33-08.826830.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T15-33-08.826830.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_33_00.369415
path:
- results_2023-07-19T22:33:00.369415.parquet
- split: 2023_09_17T15_33_08.826830
path:
- results_2023-09-17T15-33-08.826830.parquet
- split: latest
path:
- results_2023-09-17T15-33-08.826830.parquet
---
# Dataset Card for Evaluation run of upstage/llama-30b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/llama-30b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/llama-30b-instruct](https://huggingface.co/upstage/llama-30b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__llama-30b-instruct",
"harness_winogrande_5",
    split="latest")
```
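The config names listed in this card follow a simple pattern derived from the harness task string: the separators `|`, `:`, and `-` are replaced with underscores (e.g. `harness|arc:challenge|25` becomes `harness_arc_challenge_25`). A small helper to build a config name from a task id — a sketch based on the names in this card, not an official API:

```python
def task_to_config_name(task: str) -> str:
    """Convert a harness task id like 'harness|arc:challenge|25'
    into the dataset config name used in this card
    (separators '|', ':' and '-' all become underscores)."""
    for sep in ("|", ":", "-"):
        task = task.replace(sep, "_")
    return task


# Example: build the config name for the Winogrande 5-shot task.
config = task_to_config_name("harness|winogrande|5")  # "harness_winogrande_5"
```

This can be combined with `load_dataset` above to fetch the details for any evaluated task by its harness id.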
## Latest results
These are the [latest results from run 2023-09-17T15:33:08.826830](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct/blob/main/results_2023-09-17T15-33-08.826830.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.19924496644295303,
"em_stderr": 0.004090563786479079,
"f1": 0.2739314177852351,
"f1_stderr": 0.004108459298679424,
"acc": 0.46317766024223705,
"acc_stderr": 0.01006349395660694
},
"harness|drop|3": {
"em": 0.19924496644295303,
"em_stderr": 0.004090563786479079,
"f1": 0.2739314177852351,
"f1_stderr": 0.004108459298679424
},
"harness|gsm8k|5": {
"acc": 0.12130401819560273,
"acc_stderr": 0.0089928884972756
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DucHaiten/Aroma-sensei | ---
license: creativeml-openrail-m
---
|
mohammadhossein/SemEvalTask8_SubTaskB | ---
dataset_info:
features:
- name: text
dtype: string
- name: model
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
- name: id
dtype: int64
splits:
- name: train
num_bytes: 151567991
num_examples: 71027
- name: dev
num_bytes: 4814312
num_examples: 3000
download_size: 84773577
dataset_size: 156382303
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
---
|
Hansollll/summarization | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 8923387.2
num_examples: 4000
- name: test
num_bytes: 2230846.8
num_examples: 1000
download_size: 7184082
dataset_size: 11154234.0
---
# Dataset Card for "summarization"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wirthual/dip-bundestag-qa | ---
task_categories:
- question-answering
language:
- de
tags:
- government
- bundestag
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** Quelle: https://dip.bundestag.de/
### Dataset Summary
Questions and answers extracted from the DIP service. All PDFs used are of type "Antwort" (answer).
In the current version, the documents fall between the following dates:
- START_DATE = "2015-05-07"
- END_DATE = "2023-07-09"
### Languages
German
## Dataset Structure
Each row of the dataset consists of the following fields: question, answer, and document id.
The document id can be used to retrieve the metadata for the underlying PDF file by sending a request to the following endpoint:
https://search.dip.bundestag.de/api/v1/swagger-ui/#/Drucksachen/getDrucksache
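As a minimal sketch of that lookup (the exact path and auth scheme should be checked against the Swagger UI above; the function names here are hypothetical, and the DIP API expects an API key):

```python
import json
import urllib.request

# Assumed base path for the getDrucksache operation; verify against the Swagger UI.
BASE_URL = "https://search.dip.bundestag.de/api/v1/drucksache"

def build_metadata_url(doc_id: int, api_key: str) -> str:
    """Build the request URL for the metadata of one Drucksache."""
    return f"{BASE_URL}/{doc_id}?apikey={api_key}"

def fetch_metadata(doc_id: int, api_key: str) -> dict:
    """Fetch and decode the JSON metadata for a document id."""
    with urllib.request.urlopen(build_metadata_url(doc_id, api_key)) as resp:
        return json.load(resp)
```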
### Data Fields
- question
- answer
- doc_id
### Data Splits
No split
## Dataset Creation
Download the PDFs, extract the text, classify entries based on font size, [dehyphenize](https://github.com/pd3f/dehyphen) the text, and build question-answer pairs when possible.
## Dataset Curation
At this point, no complex curation of the dataset was performed.
Answers that simply referred to other answers were filtered out by these regexes:
```
'^Auf die Antwort.*verwiesen.$'
'^Es wird auf die Antwort.*verwiesen.$'
```
#### Who are the source language producers?
https://dip.bundestag.de/
### Licensing Information
Quelle: Deutscher Bundestag/Bundesrat – DIP / "Bundestags-Drucksache"
For further detail see:
https://dip.bundestag.de/documents/nutzungsbedingungen_dip.pdf
|
jahb57/glue_subset_1_sentences | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 111335869
num_examples: 1444677
download_size: 81465370
dataset_size: 111335869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joaogabrielcasanova/kauan | ---
license: openrail
---
|
open-llm-leaderboard/details_TeeZee__BigMaid-20B-v1.0 | ---
pretty_name: Evaluation run of TeeZee/BigMaid-20B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/BigMaid-20B-v1.0](https://huggingface.co/TeeZee/BigMaid-20B-v1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__BigMaid-20B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T02:04:35.386347](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__BigMaid-20B-v1.0/blob/main/results_2024-02-10T02-04-35.386347.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5671699924361193,\n\
\ \"acc_stderr\": 0.033413899757966437,\n \"acc_norm\": 0.5769999146123668,\n\
\ \"acc_norm_stderr\": 0.03425212068071861,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.552912820919783,\n\
\ \"mc2_stderr\": 0.01603443649463845\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.01423008476191048\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6617207727544314,\n\
\ \"acc_stderr\": 0.004721571443354415,\n \"acc_norm\": 0.8526190001991635,\n\
\ \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.02432631052914914,\n \"acc_norm\"\
: 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914914\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177498,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177498\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.0316314580755238,\n\
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.0316314580755238\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708436,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708436\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251742,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.016104833880142284,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.016104833880142284\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616299,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616299\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348387,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348387\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.552912820919783,\n\
\ \"mc2_stderr\": 0.01603443649463845\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855575\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \
\ \"acc_stderr\": 0.0039004133859157153\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/BigMaid-20B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-04-35.386347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-04-35.386347.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- '**/details_harness|winogrande|5_2024-02-10T02-04-35.386347.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T02-04-35.386347.parquet'
- config_name: results
data_files:
- split: 2024_02_10T02_04_35.386347
path:
- results_2024-02-10T02-04-35.386347.parquet
- split: latest
path:
- results_2024-02-10T02-04-35.386347.parquet
---
# Dataset Card for Evaluation run of TeeZee/BigMaid-20B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/BigMaid-20B-v1.0](https://huggingface.co/TeeZee/BigMaid-20B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__BigMaid-20B-v1.0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T02:04:35.386347](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__BigMaid-20B-v1.0/blob/main/results_2024-02-10T02-04-35.386347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5671699924361193,
"acc_stderr": 0.033413899757966437,
"acc_norm": 0.5769999146123668,
"acc_norm_stderr": 0.03425212068071861,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.552912820919783,
"mc2_stderr": 0.01603443649463845
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.01423008476191048
},
"harness|hellaswag|10": {
"acc": 0.6617207727544314,
"acc_stderr": 0.004721571443354415,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.02432631052914914,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.02432631052914914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177498,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177498
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.0316314580755238,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.0316314580755238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708436,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708436
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251742,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.016104833880142284,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.016104833880142284
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616299,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616299
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348387,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348387
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.552912820919783,
"mc2_stderr": 0.01603443649463845
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855575
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.0039004133859157153
}
}
```
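Once the results JSON above is loaded as a Python dict, per-task scores can be aggregated locally. A minimal sketch (the dict below inlines only a few of the tasks shown above for illustration; the key prefix `harness|hendrycksTest-` selects the MMLU subtasks):

```python
# Aggregate per-task accuracy from a results dict shaped like the JSON above.
# Only a small sample of tasks is inlined here for illustration.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5592105263157895},
    "harness|winogrande|5": {"acc": 0.7529597474348856},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_scores = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} subtasks: {mmlu_avg:.4f}")
```

The same pattern works over the full dict in the "Latest results" block; the leaderboard's own MMLU number is the unweighted mean over all 57 `hendrycksTest` subtasks.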
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
edangx100/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bradley_cooper
'1': chris_pratt
'2': dave_bautista
'3': djimon
'4': karen_gillan
'5': zoe_saldana
splits:
- name: train
num_bytes: 8292287.0
num_examples: 24
download_size: 8260548
dataset_size: 8292287.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
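The `class_label` block above fixes the integer id assigned to each identity. A minimal plain-Python sketch of that id ↔ name mapping (the `datasets` library exposes the same lookups via `ClassLabel.int2str` / `ClassLabel.str2int`):

```python
# Names copied from the card's class_label block; the mapping below is
# exactly what the dataset's integer "label" column encodes.
names = [
    "bradley_cooper", "chris_pratt", "dave_bautista",
    "djimon", "karen_gillan", "zoe_saldana",
]
int2str = dict(enumerate(names))            # e.g. 4 -> "karen_gillan"
str2int = {n: i for i, n in enumerate(names)}  # e.g. "djimon" -> 3
```

This is useful when mapping model predictions (integer ids) back to the celebrity names shown in the card.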
keras-dreambooth/marvin_paranoid_android | ---
license: apache-2.0
tags:
- dreambooth
pretty_name: Marvin the Paranoid Android
size_categories:
- n<1K
---
This dataset contains 15 images of Marvin, the Paranoid Android from the film "The Hitchhiker's Guide to the Galaxy" (2005), scraped from the internet, and 205 images of general robots generated with Stable Diffusion from the prompt "a photo of a robot".
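The two image groups correspond to the usual DreamBooth split: instance images of the subject and class (regularization) images for prior preservation. A hypothetical sketch of how they map onto training prompts; the rare identifier token `sks` and the helper below are illustrative assumptions, not part of this dataset:

```python
# Illustrative sketch of DreamBooth prompt assignment with prior preservation.
# Counts mirror the card (15 instance images, 205 class images); the "sks"
# identifier token is an assumed convention, not defined by this dataset.
N_INSTANCE, N_CLASS = 15, 205
INSTANCE_PROMPT = "a photo of sks robot"  # subject prompt with identifier token
CLASS_PROMPT = "a photo of a robot"       # prompt used to generate the class images

def training_prompts(n_instance: int, n_class: int) -> list[str]:
    """One prompt per training image: instance images get the identifier
    prompt, class (regularization) images get the generic class prompt."""
    return [INSTANCE_PROMPT] * n_instance + [CLASS_PROMPT] * n_class

prompts = training_prompts(N_INSTANCE, N_CLASS)
```

During fine-tuning, the class images and their generic prompt regularize the model so it does not collapse the whole "robot" class onto Marvin.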
open-llm-leaderboard/details_pansophic__rocket-3B | ---
pretty_name: Evaluation run of pansophic/rocket-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pansophic/rocket-3B](https://huggingface.co/pansophic/rocket-3B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pansophic__rocket-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T18:54:49.996831](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__rocket-3B/blob/main/results_2024-02-29T18-54-49.996831.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47477902512156295,\n\
\ \"acc_stderr\": 0.03464180645885471,\n \"acc_norm\": 0.476642281206908,\n\
\ \"acc_norm_stderr\": 0.03535757378171989,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245206,\n \"mc2\": 0.5581803060616521,\n\
\ \"mc2_stderr\": 0.01597966786555303\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035009,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255793\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5887273451503684,\n\
\ \"acc_stderr\": 0.004910588449330022,\n \"acc_norm\": 0.7668791077474607,\n\
\ \"acc_norm_stderr\": 0.004219544466789609\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.03074634997572347,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.03074634997572347\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127154,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127154\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.035594435655639176,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.035594435655639176\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.02493931390694077,\n\
\ \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.02493931390694077\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766104,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766104\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.020480568843998986,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.020480568843998986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5980392156862745,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.030118210106942652,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.030118210106942652\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5964240102171137,\n\
\ \"acc_stderr\": 0.01754433223792642,\n \"acc_norm\": 0.5964240102171137,\n\
\ \"acc_norm_stderr\": 0.01754433223792642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756656,\n\
\ \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756656\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260657,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.027807490044276184,\n\
\ \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.027807490044276184\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.012267935477519034,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.012267935477519034\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687765,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687765\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163907,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163907\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.033455630703391914,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.033455630703391914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245206,\n \"mc2\": 0.5581803060616521,\n\
\ \"mc2_stderr\": 0.01597966786555303\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.01311508545768171\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3646702047005307,\n \
\ \"acc_stderr\": 0.013258428375662245\n }\n}\n```"
repo_url: https://huggingface.co/pansophic/rocket-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-54-49.996831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-54-49.996831.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- '**/details_harness|winogrande|5_2024-02-29T18-54-49.996831.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T18-54-49.996831.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_54_49.996831
path:
- results_2024-02-29T18-54-49.996831.parquet
- split: latest
path:
- results_2024-02-29T18-54-49.996831.parquet
---
# Dataset Card for Evaluation run of pansophic/rocket-3B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pansophic/rocket-3B](https://huggingface.co/pansophic/rocket-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pansophic__rocket-3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T18:54:49.996831](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__rocket-3B/blob/main/results_2024-02-29T18-54-49.996831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval in its own config, under the timestamped split and the "latest" split):
```python
{
"all": {
"acc": 0.47477902512156295,
"acc_stderr": 0.03464180645885471,
"acc_norm": 0.476642281206908,
"acc_norm_stderr": 0.03535757378171989,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245206,
"mc2": 0.5581803060616521,
"mc2_stderr": 0.01597966786555303
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.014599131353035009,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255793
},
"harness|hellaswag|10": {
"acc": 0.5887273451503684,
"acc_stderr": 0.004910588449330022,
"acc_norm": 0.7668791077474607,
"acc_norm_stderr": 0.004219544466789609
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.03074634997572347,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.03074634997572347
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127154,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127154
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.035594435655639176,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.035594435655639176
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.02493931390694077,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.02493931390694077
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766104,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766104
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.020480568843998986,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.020480568843998986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942652,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942652
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5964240102171137,
"acc_stderr": 0.01754433223792642,
"acc_norm": 0.5964240102171137,
"acc_norm_stderr": 0.01754433223792642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.026864624366756656,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.026864624366756656
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260657,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.027807490044276184,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.027807490044276184
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.012267935477519034,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.012267935477519034
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163907,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163907
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.033455630703391914,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.033455630703391914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245206,
"mc2": 0.5581803060616521,
"mc2_stderr": 0.01597966786555303
},
"harness|winogrande|5": {
"acc": 0.6795580110497238,
"acc_stderr": 0.01311508545768171
},
"harness|gsm8k|5": {
"acc": 0.3646702047005307,
"acc_stderr": 0.013258428375662245
}
}
```
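The aggregated `all.acc` value above is a macro-average over the per-task `acc` scores. As a hedged sketch (not part of the official leaderboard code), here is how such an aggregate could be recomputed from a results dict of this shape, using a small subset of the task scores shown above for illustration:

```python
# Sketch: macro-average accuracy over per-task results.
# The dict below hardcodes a few of the task scores from the results JSON
# above; a real script would load the full JSON instead.
results = {
    "harness|arc:challenge|25": {"acc": 0.47952218430034127},
    "harness|hellaswag|10": {"acc": 0.5887273451503684},
    "harness|winogrande|5": {"acc": 0.6795580110497238},
}

# Average the "acc" field across all tasks present.
macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"macro-average acc over {len(results)} tasks: {macro_acc:.4f}")
```

Note that the leaderboard's own aggregation covers all 60+ tasks (and uses `acc_norm`/`mc2` for some benchmarks), so this three-task average will not match the `all.acc` value exactly.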
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DeepFoldProtein/SCOP-1.65-New-Clu90 | ---
dataset_info:
features:
- name: pdb_id_chain
dtype: string
- name: domain_ids
dtype: string
- name: domain_boundaries
dtype: string
- name: ndom
dtype: int64
- name: is_dis
dtype: int64
- name: seq
dtype: string
splits:
- name: train
num_bytes: 1890091.3731031602
num_examples: 6245
download_size: 1631897
dataset_size: 1890091.3731031602
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
completelyboofyblitzed/weather2json | ---
license: unknown
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: json_structure
dtype: string
splits:
- name: train
num_bytes: 723312
num_examples: 1044
- name: test
num_bytes: 363202
num_examples: 522
download_size: 113015
dataset_size: 1086514
---
|
unreal-hug/REAL_DATASET_SEG_331 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 89739150.0
num_examples: 331
download_size: 7303591
dataset_size: 89739150.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kasumi_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kasumi/霞 (Kantai Collection)
This is the dataset of kasumi/霞 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `grey_hair, long_hair, side_ponytail, brown_eyes, ribbon, hair_ribbon, black_ribbon, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 574.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 325.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1262 | 744.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 506.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1262 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kasumi_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, short_sleeves, solo, arm_warmers, looking_at_viewer, pleated_skirt, school_uniform, white_background, white_shirt, bow, simple_background, suspender_skirt |
| 1 | 10 |  |  |  |  |  | 1girl, pinafore_dress, simple_background, solo, white_background, long_sleeves, looking_at_viewer, white_shirt, blush, school_uniform, belt |
| 2 | 8 |  |  |  |  |  | 1girl, black_dress, long_sleeves, looking_at_viewer, pinafore_dress, solo, white_shirt, simple_background, belt, neck_ribbon, white_background, blush, closed_mouth, buttons, collared_shirt, school_uniform, bangs, upper_body |
| 3 | 5 |  |  |  |  |  | 1girl, kneehighs, long_sleeves, looking_at_viewer, pinafore_dress, school_uniform, solo, white_shirt, black_socks, blush, simple_background, wariza, white_background, belt, open_mouth |
| 4 | 9 |  |  |  |  |  | 1girl, black_dress, enmaided, maid_apron, maid_headdress, solo, white_apron, blush, frilled_apron, looking_at_viewer, closed_mouth, simple_background, holding, long_sleeves, panties, puffy_short_sleeves, white_background |
| 5 | 37 |  |  |  |  |  | 1girl, solo, blush, green_bikini, looking_at_viewer, collarbone, navel, frilled_bikini, simple_background, bikini_skirt, cowboy_shot, white_background, small_breasts, groin, closed_mouth, yellow_eyes, open_mouth |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, blush, cum_in_pussy, hetero, open_mouth, solo_focus, vaginal, navel, penis, small_breasts, bar_censor, girl_on_top, nipples, spread_legs, bikini_bottom_aside, black_bikini, clothed_female_nude_male, clothed_sex, collarbone, cowgirl_position, heart-shaped_pupils, looking_at_viewer, tears |
| 7 | 5 |  |  |  |  |  | 1girl, floral_print, looking_at_viewer, obi, solo, blush, alternate_costume, long_sleeves, print_kimono, upper_body, wide_sleeves, yellow_eyes, blue_kimono, hair_ornament, open_mouth, smile, yukata |
| 8 | 7 |  |  |  |  |  | 1girl, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, solo, strapless_leotard, wrist_cuffs, blush, looking_at_viewer, small_breasts, black_leotard, cowboy_shot, rabbit_tail, simple_background, white_background, bare_legs, dated, open_mouth, red_bowtie, white_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | short_sleeves | solo | arm_warmers | looking_at_viewer | pleated_skirt | school_uniform | white_background | white_shirt | bow | simple_background | suspender_skirt | pinafore_dress | long_sleeves | belt | black_dress | neck_ribbon | closed_mouth | buttons | collared_shirt | bangs | upper_body | kneehighs | black_socks | wariza | open_mouth | enmaided | maid_apron | maid_headdress | white_apron | frilled_apron | holding | panties | puffy_short_sleeves | green_bikini | collarbone | navel | frilled_bikini | bikini_skirt | cowboy_shot | small_breasts | groin | yellow_eyes | 1boy | cum_in_pussy | hetero | solo_focus | vaginal | penis | bar_censor | girl_on_top | nipples | spread_legs | bikini_bottom_aside | black_bikini | clothed_female_nude_male | clothed_sex | cowgirl_position | heart-shaped_pupils | tears | floral_print | obi | alternate_costume | print_kimono | wide_sleeves | blue_kimono | hair_ornament | smile | yukata | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | black_leotard | rabbit_tail | bare_legs | dated | red_bowtie | white_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------------|:-------|:--------------|:--------------------|:----------------|:-----------------|:-------------------|:--------------|:------|:--------------------|:------------------|:-----------------|:---------------|:-------|:--------------|:--------------|:---------------|:----------|:-----------------|:--------|:-------------|:------------|:--------------|:---------|:-------------|:-----------|:-------------|:-----------------|:--------------|:----------------|:----------|:----------|:----------------------|:---------------|:-------------|:--------|:-----------------|:---------------|:--------------|:----------------|:--------|:--------------|:-------|:---------------|:---------|:-------------|:----------|:--------|:-------------|:--------------|:----------|:--------------|:----------------------|:---------------|:---------------------------|:--------------|:-------------------|:----------------------|:--------|:---------------|:------|:--------------------|:---------------|:---------------|:--------------|:----------------|:--------|:---------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:----------------|:--------------|:------------|:--------|:-------------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | | X | | X | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | X | X | X | | X | | X | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | | X | | | X | | | X | | | X | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 37 |  |  |  |  |  | X | X | | X | | X | | | X | | | X | | | | | | | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | | | | | X | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | | X | | X | | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
CATIE-AQ/squad_v2_french_translated_fr_prompt_question_generation_with_answer | ---
language:
- fr
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- squad_v2_french_translated
---
# squad_v2_french_translated_fr_prompt_question_generation_with_answer
## Summary
**squad_v2_french_translated_fr_prompt_question_generation_with_answer** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **1,165,934** rows that can be used for a question-generation (with answer) task.
The original data (without prompts) comes from the dataset [pragnakalp/squad_v2_french_translated](https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated) and was augmented by questions in SQUAD 2.0 format in the [FrenchQA](https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
22 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative tense, in the informal (tutoiement) form, and in the formal (vouvoiement) form.
```
'Quelle question donnerait la réponse suivante ? Réponse : "'+answer+'";\nQuestion :',
'Déterminer la question qui aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Détermine la question que tu aurais pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Déterminez la question que vous auriez pu poser pour obtenir la réponse suivante . \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante. \n Réponse : "'+answer+'";\n Question :',
'Sachant la réponse suivante : "'+answer+'"\n Générer une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Génère une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Générez une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouves une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Trouvez une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Créer une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Crée trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Créez trouver une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecrire une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecris une bonne question : ',
'Sachant la réponse suivante : "'+answer+'"\n Ecrivez une bonne question : '
```
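As a concrete illustration, a single row's input is built by splicing an answer into one of the prompts above. This is a minimal sketch, not the authors' actual generation script; the function name and the choice of prompt are ours:

```python
# Hypothetical helper showing how one prompt from the list above is
# combined with an answer string to form an input text.
def build_input(answer: str) -> str:
    # First prompt from the list, reproduced verbatim.
    return 'Quelle question donnerait la réponse suivante ? Réponse : "' + answer + '";\nQuestion :'

print(build_input("la Tour Eiffel"))
```

The target column then holds the corresponding question from the source SQuAD-style data.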
# Splits
- `train` with 1,165,934 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/squad_v2_french_translated_fr_prompt_question_generation_with_answer")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated
## This Dataset
```
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author    = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title     = { DFP (Revision 1d24c09) },
  year      = 2023,
  url       = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi       = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
apache-2.0 |
Multimodal-Fatima/VQAv2_test_no_image_split_4 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 2173513371
num_examples: 44779
download_size: 570289348
dataset_size: 2173513371
---
# Dataset Card for "VQAv2_test_no_image_split_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JaehyungKim/p2c_polite_stack | ---
license: other
license_name: following-original-dataset
license_link: LICENSE
---
|
open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Holomax | ---
pretty_name: Evaluation run of KoboldAI/LLaMA2-13B-Holomax
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/LLaMA2-13B-Holomax](https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Holomax\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T03:44:46.836868](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Holomax/blob/main/results_2023-10-19T03-44-46.836868.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.0004566676462666988,\n \"f1\": 0.06074769295302006,\n\
\ \"f1_stderr\": 0.0013672043421452582,\n \"acc\": 0.4305631433729482,\n\
\ \"acc_stderr\": 0.010496955983172063\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666988,\n\
\ \"f1\": 0.06074769295302006,\n \"f1_stderr\": 0.0013672043421452582\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11448066717210008,\n \
\ \"acc_stderr\": 0.008770157532110506\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233623\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T03_44_46.836868
path:
- '**/details_harness|drop|3_2023-10-19T03-44-46.836868.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T03-44-46.836868.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T03_44_46.836868
path:
- '**/details_harness|gsm8k|5_2023-10-19T03-44-46.836868.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T03-44-46.836868.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T03_44_46.836868
path:
- '**/details_harness|winogrande|5_2023-10-19T03-44-46.836868.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T03-44-46.836868.parquet'
- config_name: results
data_files:
- split: 2023_10_19T03_44_46.836868
path:
- results_2023-10-19T03-44-46.836868.parquet
- split: latest
path:
- results_2023-10-19T03-44-46.836868.parquet
---
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Holomax
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Holomax](https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Holomax",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T03:44:46.836868](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Holomax/blob/main/results_2023-10-19T03-44-46.836868.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666988,
"f1": 0.06074769295302006,
"f1_stderr": 0.0013672043421452582,
"acc": 0.4305631433729482,
"acc_stderr": 0.010496955983172063
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666988,
"f1": 0.06074769295302006,
"f1_stderr": 0.0013672043421452582
},
"harness|gsm8k|5": {
"acc": 0.11448066717210008,
"acc_stderr": 0.008770157532110506
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233623
}
}
```
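A quick check on the numbers above: the aggregate `acc` appears to be the unweighted mean of the gsm8k and winogrande accuracies (an observation from these figures, not a documented guarantee of the leaderboard's aggregation):

```python
# Reproduce the aggregate "acc" from the per-task accuracies above.
gsm8k_acc = 0.11448066717210008
winogrande_acc = 0.7466456195737964

aggregate = (gsm8k_acc + winogrande_acc) / 2
print(aggregate)  # ~0.4305631433729482, matching "all"/"acc"
```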
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HelloFriend22/test | ---
dataset_info:
features:
- name: response
dtype: string
- name: type
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: answer_0
dtype: string
- name: solution_0
dtype: string
- name: correct_0
dtype: bool
- name: answer_1
dtype: string
- name: solution_1
dtype: string
- name: correct_1
dtype: bool
- name: answer_2
dtype: string
- name: solution_2
dtype: string
- name: correct_2
dtype: bool
- name: answer_3
dtype: string
- name: solution_3
dtype: string
- name: correct_3
dtype: bool
- name: answer_diversity
dtype: int64
splits:
- name: train
num_bytes: 2901035
num_examples: 1000
download_size: 1294129
dataset_size: 2901035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bn_hate_speech | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- found
language:
- bn
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
paperswithcode_id: bengali-hate-speech
pretty_name: Bengali Hate Speech Dataset
tags:
- hate-speech-topic-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Personal
'1': Political
'2': Religious
'3': Geopolitical
'4': Gender abusive
splits:
- name: train
num_bytes: 972631
num_examples: 3418
download_size: 389814
dataset_size: 972631
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Bengali Hate Speech Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Bengali Hate Speech Dataset](https://github.com/rezacsedu/Bengali-Hate-Speech-Dataset)
- **Repository:** [Bengali Hate Speech Dataset](https://github.com/rezacsedu/Bengali-Hate-Speech-Dataset)
- **Paper:** [Classification Benchmarks for Under-resourced Bengali Language based on Multichannel Convolutional-LSTM Network](https://arxiv.org/abs/2004.07807)
- **Point of Contact:** [Md. Rezaul Karim](rezaul.karim.fit@gmail.com)
### Dataset Summary
The Bengali Hate Speech Dataset is a Bengali-language dataset of news articles collected from various Bengali media sources and categorized based on the type of hate in the text. The dataset was created to provide greater support for under-resourced languages like Bengali on NLP tasks, and serves as a benchmark for multiple types of classification tasks.
### Supported Tasks and Leaderboards
* `topic classification`: The dataset can be used to train a Multichannel Convolutional-LSTM for classifying different types of hate speech. The model performance can be measured by its F1 score.
### Languages
The text in the dataset is in Bengali and the associated BCP-47 code is `bn`.
## Dataset Structure
### Data Instances
A data instance takes the form of a news article and its associated label.
🚨 Beware that the following example contains extremely offensive content!
An example looks like this:
```
{"text": "রেন্ডিয়াকে পৃথীবির মানচিএ থেকে মুচে ফেলতে হবে",
"label": "Geopolitical"}
```
### Data Fields
* `text`: the text of the Bengali news article
* `label`: one of `Geopolitical`, `Personal`, `Political`, `Religious`, or `Gender abusive` indicating the type of hate speech
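The integer ids stored in the `label` column follow the order of the `class_label` names in the YAML schema above. A minimal mapping sketch (the dict names are ours, for illustration):

```python
# Label id -> name mapping implied by the class_label schema above.
ID2LABEL = {
    0: "Personal",
    1: "Political",
    2: "Religious",
    3: "Geopolitical",
    4: "Gender abusive",
}
LABEL2ID = {name: i for i, name in ID2LABEL.items()}

print(ID2LABEL[3])           # Geopolitical
print(LABEL2ID["Personal"])  # 0
```

The same conversion is available via `datasets.ClassLabel.int2str` / `str2int` when the dataset is loaded with the `datasets` library.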
### Data Splits
The dataset has 3418 examples.
## Dataset Creation
### Curation Rationale
Under-resourced languages like Bengali lack supporting resources that languages like English have. This dataset was collected from multiple Bengali news sources to provide several classification benchmarks for hate speech detection, document classification and sentiment analysis.
### Source Data
#### Initial Data Collection and Normalization
Bengali articles were collected from a Bengali Wikipedia dump, Bengali news articles, news dumps of TV channels, books, blogs, sports portals and social media. Emphasis was placed on Facebook pages and newspaper sources because they have about 50 million followers and are a common source of opinion and hate speech. The full dataset consists of 250 million articles and is currently being prepared. This is a subset of the full dataset.
#### Who are the source language producers?
The source language producers are Bengali authors and users who interact with these various forms of Bengali media.
### Annotations
#### Annotation process
The data was annotated by manually identifying frequently occurring terms in texts containing hate speech and references to specific entities. The authors also prepared normalized frequency vectors of 175 abusive terms that are commonly used to express hate in Bengali. A hate label is assigned if at least one of these terms exists in the text. Annotators were provided with unbiased, text-only content to make their decisions. Non-hate statements were removed from the list, and the hate category was further divided into political, personal, gender abusive, geopolitical and religious. To reduce possible bias, each label was assigned based on majority voting over the annotators' opinions, and Cohen's Kappa was computed to measure inter-annotator agreement.
#### Who are the annotators?
Three native Bengali speakers and two linguists annotated the dataset which was then reviewed and validated by three experts (one South Asian linguist and two native speakers).
### Personal and Sensitive Information
The dataset contains very sensitive and highly offensive comments in a religious, political and gendered context. Some of the comments are directed towards contemporary public figures like politicians, religious leaders, celebrities and athletes.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of the dataset is to improve hate speech detection in Bengali. The growth of social media has enabled people to express hate freely online, and there has been a lot of focus on detecting hate speech for highly resourced languages like English. As in other major languages, the use of hate speech in Bengali is pervasive and can have serious and deadly consequences. Failure to react to hate speech renders targeted minorities more vulnerable to attack, and it can also create indifference towards their treatment from majority populations.
### Discussion of Biases
The dataset was collected using a bootstrapping approach. An initial search was made for specific types of texts, articles and tweets containing common harassment directed at targeted characteristics. As a result, this dataset contains **extremely** offensive content that is disturbing. In addition, Facebook pages and newspaper sources were emphasized because they are well known for hate and harassment issues.
### Other Known Limitations
The dataset contains racist, sexist, homophobic and offensive comments. It is collected and annotated for research related purposes only.
## Additional Information
### Dataset Curators
The dataset was curated by Md. Rezaul Karim, Sumon Kanti Dey, Bharathi Raja Chakravarthi, John McCrae and Michael Cochez.
### Licensing Information
This dataset is licensed under the MIT License.
### Citation Information
```
@inproceedings{karim2020BengaliNLP,
title={Classification Benchmarks for Under-resourced Bengali Language based on Multichannel Convolutional-LSTM Network},
author={Karim, Md. Rezaul and Chakravarti, Bharathi Raja and P. McCrae, John and Cochez, Michael},
booktitle={7th IEEE International Conference on Data Science and Advanced Analytics (IEEE DSAA,2020)},
publisher={IEEE},
year={2020}
}
```
### Contributions
Thanks to [@stevhliu](https://github.com/stevhliu) for adding this dataset. |
Chapian/PPE_detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
### Dataset Labels
```
['ad', 'airsweeper', 'bombtower', 'canon', 'clancastle', 'eagle', 'inferno', 'kingpad', 'mortar', 'queenpad', 'rcpad', 'scattershot', 'th13', 'wardenpad', 'wizztower', 'xbow']
```
### How to Use

```python
from datasets import load_dataset

dataset = load_dataset("Chapian/PPE_detection")
```
tbilisi-ai-lab/TinyStoriesInstruct | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: 'Features:'
dtype: string
- name: 'Summary:'
dtype: string
- name: 'Random sentence:'
dtype: string
- name: 'Story:'
sequence: string
- name: 'Words:'
dtype: string
splits:
- name: train
num_bytes: 2342624089
num_examples: 2476532
- name: valid
num_bytes: 23620552
num_examples: 25027
download_size: 1276411379
dataset_size: 2366244641
---
|
kuanhuggingface/amazon_tts_encodec | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: instruction
dtype: string
- name: transcription
dtype: string
- name: src_encodec_0
sequence: int64
- name: src_encodec_1
sequence: int64
- name: src_encodec_2
sequence: int64
- name: src_encodec_3
sequence: int64
- name: src_encodec_4
sequence: int64
- name: src_encodec_5
sequence: int64
- name: src_encodec_6
sequence: int64
- name: src_encodec_7
sequence: int64
- name: tgt_encodec_0
sequence: int64
- name: tgt_encodec_1
sequence: int64
- name: tgt_encodec_2
sequence: int64
- name: tgt_encodec_3
sequence: int64
- name: tgt_encodec_4
sequence: int64
- name: tgt_encodec_5
sequence: int64
- name: tgt_encodec_6
sequence: int64
- name: tgt_encodec_7
sequence: int64
splits:
- name: train
num_bytes: 6057391940
num_examples: 171430
- name: validation
num_bytes: 351554634
num_examples: 10000
- name: test
num_bytes: 353040020
num_examples: 10000
download_size: 506194253
dataset_size: 6761986594
---
# Dataset Card for "amazon_tts_encodec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
determined-ai/customers-complaints | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Product
dtype: string
- name: Sub_product
dtype: string
- name: Consumer_complaint_narrative
dtype: string
splits:
- name: train
num_bytes: 25124940.0
num_examples: 24000
- name: eval
num_bytes: 3140617.5
num_examples: 3000
- name: test
num_bytes: 3140617.5
num_examples: 3000
download_size: 14662968
dataset_size: 31406175.0
---
# Dataset Card for "customers-complaints"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kenhktsui/off-topic | ---
license: apache-2.0
---
|
Kalfrin/edataset | ---
license: openrail
---
|
RayhanADev/replit-comments-categorized | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: Replit Comments Categorized
size_categories:
- n<1K
source_datasets:
- original
tags:
- replit
- comments
- forum
- chat
- intent
- classification
task_categories:
- text-classification
task_ids:
- intent-classification
---
# Dataset Card for Replit Comments Categorized
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/RayhanADev/replit-comments-categorized
- **Repository:** https://huggingface.co/datasets/RayhanADev/replit-comments-categorized
- **Point of Contact:** ray@furret.dev
### Dataset Summary
Comments from [Replit](https://replit.com/)'s Community, sourced via moderator GraphQL queries and personally labeled :). For use in Replit + Weights and Biases Hackathon.
### Supported Tasks and Leaderboards
Text Classification
### Languages
English
## Dataset Structure
### Data Instances
```json
{"label":3,"text":"@KENDALPETERSON\nShut up you dont have a permit to brag."}
```
Labels
- 0: General
- 1: Spam
- 2: NSFW
- 3: Harassment
### Data Fields
Label, Text
### Data Splits
Train, Validation, Test
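Given the instance format and label legend above, a record can be decoded with the standard library (the id-to-name mapping below simply restates the legend):

```python
import json

# Label legend from this card
ID2LABEL = {0: "General", 1: "Spam", 2: "NSFW", 3: "Harassment"}

def decode(line: str) -> tuple[str, str]:
    """Parse one JSON record into (label_name, text)."""
    record = json.loads(line)
    return ID2LABEL[record["label"]], record["text"]

label_name, text = decode('{"label":3,"text":"example comment"}')
```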
## Dataset Creation
### Curation Rationale
Fine-tuning data for the Replit + Weights and Biases Hackathon.
### Source Data
#### Initial Data Collection and Normalization
This data was collected via Replit's GraphQL API using a query only available to site moderators and admins, allowing for querying comments by pattern. By not setting a
pattern one could get up to 50 comments within a given time span. Data was sourced from February 1st, 2023 to February 10th, 2023.
#### Who are the source language producers?
Replit Community users
### Annotations
#### Annotation process
I tagged the data using a small website that I made to assign labels to individual comments.

#### Who are the annotators?
Me ([@rayhanadev](https://www.furret.dev))
### Personal and Sensitive Information
This dataset is purely unfiltered and may contain personal and sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
The data is categorized into three categories of content that are not allowed on Replit, plus a fourth "General" category for acceptable content.
### Other Known Limitations
The dataset is small (n<1k), but I am working on increasing the amount of available data. There is also significantly more content labelled as unacceptable than as acceptable ("General").
## Additional Information
### Dataset Curators
Me ([@rayhanadev](https://www.furret.dev))
### Licensing Information
This data is licensed under MIT, but Replit or any concerned party is free to issue a takedown of this dataset. Any person whose data appears in this dataset is free to request that their data be deleted; however, making this data available is not a breach of Replit's data use policy.
|
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_CM_T_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 6098587
num_examples: 1000
download_size: 1134710
dataset_size: 6098587
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_CM_T_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yam-peleg__Experiment7-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment7-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment7-7B](https://huggingface.co/yam-peleg/Experiment7-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment7-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T19:25:35.851401](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment7-7B/blob/main/results_2024-02-11T19-25-35.851401.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567730273281559,\n\
\ \"acc_stderr\": 0.03199263283100071,\n \"acc_norm\": 0.6574973933898954,\n\
\ \"acc_norm_stderr\": 0.03264096338099939,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7059020061319847,\n\
\ \"mc2_stderr\": 0.01499668880054581\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n\
\ \"acc_stderr\": 0.004516215206715352,\n \"acc_norm\": 0.8804023102967536,\n\
\ \"acc_norm_stderr\": 0.003238273295284749\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"\
acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n\
\ \"acc_stderr\": 0.016650437588269073,\n \"acc_norm\": 0.45363128491620114,\n\
\ \"acc_norm_stderr\": 0.016650437588269073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7059020061319847,\n\
\ \"mc2_stderr\": 0.01499668880054581\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \
\ \"acc_stderr\": 0.013159909755930324\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment7-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|arc:challenge|25_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|gsm8k|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hellaswag|10_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T19-25-35.851401.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- '**/details_harness|winogrande|5_2024-02-11T19-25-35.851401.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T19-25-35.851401.parquet'
- config_name: results
data_files:
- split: 2024_02_11T19_25_35.851401
path:
- results_2024-02-11T19-25-35.851401.parquet
- split: latest
path:
- results_2024-02-11T19-25-35.851401.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment7-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment7-7B](https://huggingface.co/yam-peleg/Experiment7-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment7-7B",
"harness_winogrande_5",
split="train")
```
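Since each run is stored as a split named after its timestamp (with `:` and `-` replaced by `_`), these split names sort lexicographically in chronological order. A minimal sketch of selecting the most recent run from a list of split names (the `latest_split` helper is hypothetical, not part of the `datasets` API):

```python
def latest_split(split_names):
    """Return the most recent timestamped split name.

    Timestamped split names like '2024_02_11T19_25_35.851401' are
    zero-padded, so lexicographic max() equals chronological max().
    The alias split 'latest' is excluded from the comparison.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)

splits = ["latest", "2024_02_10T08_00_00.000000", "2024_02_11T19_25_35.851401"]
print(latest_split(splits))  # → 2024_02_11T19_25_35.851401
```

In practice you can simply request `split="latest"` (or `split="train"`, as above) and let the repo's config resolve the newest run for you.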
## Latest results
These are the [latest results from run 2024-02-11T19:25:35.851401](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment7-7B/blob/main/results_2024-02-11T19-25-35.851401.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6567730273281559,
"acc_stderr": 0.03199263283100071,
"acc_norm": 0.6574973933898954,
"acc_norm_stderr": 0.03264096338099939,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7059020061319847,
"mc2_stderr": 0.01499668880054581
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068744,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715352,
"acc_norm": 0.8804023102967536,
"acc_norm_stderr": 0.003238273295284749
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269073,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7059020061319847,
"mc2_stderr": 0.01499668880054581
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930324
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FLAVIONEP/TREINAVOZFLAVIO | ---
license: openrail
---
|
zolak/twitter_dataset_80_1713114929 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 368107
num_examples: 909
download_size: 186836
dataset_size: 368107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_ca_viquiquad | ---
language: ca
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ca_viquiquad
# UIT-ViQuAD – A Vietnamese Dataset for Evaluating Machine Reading Comprehension.
- Dataset uid: `viquiquad`
### Description
Vietnamese Question Answering Dataset (UIT-ViQuAD) is a new
dataset for evaluating MRC models in Vietnamese, a low-resource language. It comprises over 23,000 human-generated question-answer pairs based on 5,109 passages from 174 Vietnamese Wikipedia articles.
### Homepage
https://sites.google.com/uit.edu.vn/uit-nlp/datasets-projects
### Licensing
- open license
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
- Creative Commons Attribution 4.0 International License
### Speaker Locations
- South-eastern Asia
- Vietnam
### Sizes
- 0.0001 % of total
- 0.0047 % of ca
### BigScience processing steps
#### Filters applied to: ca
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
harry558/harrymodhgill5590 | ---
license: bigscience-openrail-m
---
|
yeboyswag/yourmumjokes | ---
language:
- en
size_categories:
- n<1K
tags:
- joke
- input
--- |
codeparrot/github-code | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: github-code
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
---
# GitHub Code Dataset
## Dataset Description
The GitHub Code dataset consists of 115M code files from GitHub in 30 programming languages with 60 extensions, totaling 1TB of data. The dataset was created from the public GitHub dataset on Google BigQuery.
### How to use it
The GitHub Code dataset is very large, so for most use cases it is recommended to use the streaming API of `datasets`. You can load and iterate through the dataset with the following two lines of code:
```python
from datasets import load_dataset
ds = load_dataset("codeparrot/github-code", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'code': "import mod189 from './mod189';\nvar value=mod189+1;\nexport default value;\n",
'repo_name': 'MirekSz/webpack-es6-ts',
'path': 'app/mods/mod190.js',
'language': 'JavaScript',
'license': 'isc',
'size': 73
}
```
You can see that besides the code, repo name, and path, the programming language, license, and file size are also part of each example. You can filter the dataset for any subset of the 30 included languages (see the full list below) by passing them as a list. E.g. if your dream is to build a Codex model for Dockerfiles, use the following configuration:
```python
ds = load_dataset("codeparrot/github-code", streaming=True, split="train", languages=["Dockerfile"])
print(next(iter(ds))["code"])
#OUTPUT:
"""\
FROM rockyluke/ubuntu:precise
ENV DEBIAN_FRONTEND="noninteractive" \
TZ="Europe/Amsterdam"
...
"""
```
We also have access to the license of each file's origin repo, so we can filter by license in the same way we filtered by language:
```python
from collections import Counter

ds = load_dataset("codeparrot/github-code", streaming=True, split="train", licenses=["mit", "isc"])
licenses = []
for element in ds.take(10_000):
    licenses.append(element["license"])
print(Counter(licenses))
#OUTPUT:
Counter({'mit': 9896, 'isc': 104})
```
Naturally, you can also download the full dataset. Note that this will download ~300GB of compressed text data, and the uncompressed dataset will take up ~1TB of storage:
```python
ds = load_dataset("codeparrot/github-code", split="train")
```
## Data Structure
### Data Instances
```python
{
'code': "import mod189 from './mod189';\nvar value=mod189+1;\nexport default value;\n",
'repo_name': 'MirekSz/webpack-es6-ts',
'path': 'app/mods/mod190.js',
'language': 'JavaScript',
'license': 'isc',
'size': 73
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|code|string|content of source file|
|repo_name|string|name of the GitHub repository|
|path|string|path of file in GitHub repository|
|language|string|programming language as inferred by extension|
|license|string|license of GitHub repository|
|size|int|size of source file in bytes|
### Data Splits
The dataset only contains a train split.
## Languages
The dataset contains 30 programming languages with over 60 extensions:
```python
{
"Assembly": [".asm"],
"Batchfile": [".bat", ".cmd"],
"C": [".c", ".h"],
"C#": [".cs"],
"C++": [".cpp", ".hpp", ".c++", ".h++", ".cc", ".hh", ".C", ".H"],
"CMake": [".cmake"],
"CSS": [".css"],
"Dockerfile": [".dockerfile", "Dockerfile"],
"FORTRAN": ['.f90', '.f', '.f03', '.f08', '.f77', '.f95', '.for', '.fpp'],
"GO": [".go"],
"Haskell": [".hs"],
"HTML":[".html"],
"Java": [".java"],
"JavaScript": [".js"],
"Julia": [".jl"],
"Lua": [".lua"],
"Makefile": ["Makefile"],
"Markdown": [".md", ".markdown"],
"PHP": [".php", ".php3", ".php4", ".php5", ".phps", ".phpt"],
"Perl": [".pl", ".pm", ".pod", ".perl"],
"PowerShell": ['.ps1', '.psd1', '.psm1'],
"Python": [".py"],
"Ruby": [".rb"],
"Rust": [".rs"],
"SQL": [".sql"],
"Scala": [".scala"],
"Shell": [".sh", ".bash", ".command", ".zsh"],
"TypeScript": [".ts", ".tsx"],
"TeX": [".tex"],
"Visual Basic": [".vb"]
}
```
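As a minimal sketch (not part of the dataset tooling, and using a truncated copy of the map above for brevity), the extension map can be used to infer a language label from a file path. Note that entries without a leading dot, such as `Makefile` and `Dockerfile`, match exact filenames rather than extensions:

```python
import os

# Truncated copy of the extension map above; shown for illustration only.
EXTENSION_MAP = {
    "Python": [".py"],
    "C": [".c", ".h"],
    "Dockerfile": [".dockerfile", "Dockerfile"],
    "Makefile": ["Makefile"],
}


def infer_language(path):
    """Return the language for a file path, or None if no pattern matches."""
    filename = os.path.basename(path)
    for language, patterns in EXTENSION_MAP.items():
        for pattern in patterns:
            # Patterns starting with a dot are extensions; others are exact filenames.
            if pattern.startswith(".") and filename.endswith(pattern):
                return language
            if not pattern.startswith(".") and filename == pattern:
                return language
    return None
```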
## Licenses
Each example is also annotated with the license of the associated repository. There are in total 15 licenses:
```python
[
'mit',
'apache-2.0',
'gpl-3.0',
'gpl-2.0',
'bsd-3-clause',
'agpl-3.0',
'lgpl-3.0',
'lgpl-2.1',
'bsd-2-clause',
'cc0-1.0',
'epl-1.0',
'mpl-2.0',
'unlicense',
'isc',
'artistic-2.0'
]
```
## Dataset Statistics
The dataset contains 115M files and the sum of all the source code file sizes is 873 GB (note that the size of the dataset is larger due to the extra fields). A breakdown per language is given in the plot and table below:

| | Language |File Count| Size (GB)|
|---:|:-------------|---------:|-------:|
| 0 | Java | 19548190 | 107.70 |
| 1 | C | 14143113 | 183.83 |
| 2 | JavaScript | 11839883 | 87.82 |
| 3 | HTML | 11178557 | 118.12 |
| 4 | PHP | 11177610 | 61.41 |
| 5 | Markdown | 8464626 | 23.09 |
| 6 | C++ | 7380520 | 87.73 |
| 7 | Python | 7226626 | 52.03 |
| 8 | C# | 6811652 | 36.83 |
| 9 | Ruby | 4473331 | 10.95 |
| 10 | GO | 2265436 | 19.28 |
| 11 | TypeScript | 1940406 | 24.59 |
| 12 | CSS | 1734406 | 22.67 |
| 13 | Shell | 1385648 | 3.01 |
| 14 | Scala | 835755 | 3.87 |
| 15 | Makefile | 679430 | 2.92 |
| 16 | SQL | 656671 | 5.67 |
| 17 | Lua | 578554 | 2.81 |
| 18 | Perl | 497949 | 4.70 |
| 19 | Dockerfile | 366505 | 0.71 |
| 20 | Haskell | 340623 | 1.85 |
| 21 | Rust | 322431 | 2.68 |
| 22 | TeX | 251015 | 2.15 |
| 23 | Batchfile | 236945 | 0.70 |
| 24 | CMake | 175282 | 0.54 |
| 25 | Visual Basic | 155652 | 1.91 |
| 26 | FORTRAN | 142038 | 1.62 |
| 27 | PowerShell | 136846 | 0.69 |
| 28 | Assembly | 82905 | 0.78 |
| 29 | Julia | 58317 | 0.29 |
## Dataset Creation
The dataset was created in two steps:
1. Files with the extensions given in the list above were retrieved from the GitHub dataset on BigQuery (full query [here](https://huggingface.co/datasets/codeparrot/github-code/blob/main/query.sql)). The query was executed on _Mar 16, 2022, 6:23:39 PM UTC+1_.
2. Files with lines longer than 1000 characters and duplicates (exact duplicates ignoring whitespaces) were dropped (full preprocessing script [here](https://huggingface.co/datasets/codeparrot/github-code/blob/main/github_preprocessing.py)).
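The two filtering steps can be sketched roughly as follows (a hypothetical simplification for illustration, not the actual preprocessing script linked above):

```python
import hashlib

MAX_LINE_LENGTH = 1000  # files containing any longer line are dropped


def keep_file(code, seen_hashes):
    """Return True if the file passes the length filter and is not a duplicate."""
    # Step 1: drop files containing a line longer than 1000 characters.
    if any(len(line) > MAX_LINE_LENGTH for line in code.splitlines()):
        return False
    # Step 2: exact-duplicate detection, ignoring all whitespace.
    normalized = "".join(code.split())
    digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
    if digest in seen_hashes:
        return False
    seen_hashes.add(digest)
    return True
```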
## Considerations for Using the Data
The dataset consists of source code from a wide range of repositories. As such, it can potentially include harmful or biased code, as well as sensitive information such as passwords or usernames.
## Releases
You can load any older version of the dataset with the `revision` argument:
```Python
ds = load_dataset("codeparrot/github-code", revision="v1.0")
```
### v1.0
- Initial release of dataset
- The query was executed on _Feb 14, 2022, 12:03:16 PM UTC+1_
### v1.1
- Fix missing Scala/TypeScript
- Fix deduplication issue with inconsistent Python `hash`
- The query was executed on _Mar 16, 2022, 6:23:39 PM UTC+1_
|
Felladrin/pretrain-webglm-qa | ---
language:
- en
license: apache-2.0
source_datasets:
- THUDM/webglm-qa
---
Conversion of [THUDM/webglm-qa](https://huggingface.co/datasets/THUDM/webglm-qa) dataset to be used in pretraining.
Python code used for conversion:
```python
import re

import pandas
from datasets import load_dataset

dataset = load_dataset("THUDM/webglm-qa", split="train")


def format(columns):
    # Strip bracketed citation markers such as [1] from the answer text.
    return re.sub(r'\[\d\]', '', columns["answer"].strip())


# Keep only the cleaned answer text for pretraining.
pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_csv("train.csv", index=False)
```
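For illustration, the regex in the snippet above removes single-digit citation markers such as `[1]` and trims surrounding whitespace (`strip_citations` is a hypothetical name used here for clarity):

```python
import re


def strip_citations(answer):
    # Same transformation as the conversion script above.
    return re.sub(r'\[\d\]', '', answer.strip())


print(strip_citations("Water boils at 100 degrees Celsius at sea level.[1][2]"))
# Water boils at 100 degrees Celsius at sea level.
```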
|
Tamazight-NLP/Pontoon-Translations | ---
configs:
- config_name: en-zgh
data_files: en-zgh.tsv
default: true
- config_name: en-kab
data_files: en-kab.tsv
- config_name: en-tzm
data_files: en-tzm.tsv
- config_name: en-shi
data_files: en-shi.tsv
license: mpl-2.0
task_categories:
- translation
- text2text-generation
language:
- ber
- zgh
- kab
- tzm
- shi
- en
size_categories:
- 10K<n<100K
pretty_name: Pontoon Translations
---
# Pontoon Translations
Amazigh subset of [Pontoon Translations](https://huggingface.co/datasets/ayymen/Pontoon-Translations). |
tttarun/llama2_hindi | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26205181
num_examples: 3767
download_size: 8375550
dataset_size: 26205181
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davidfant/natural-questions-chunk-27 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4693704576
num_examples: 10000
download_size: 1826031793
dataset_size: 4693704576
---
# Dataset Card for "natural-questions-chunk-27"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adalib/starcoder-filtered-apis | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: graphscope
num_bytes: 1369996
num_examples: 122
- name: evaluate
num_bytes: 9197241
num_examples: 690
- name: fate_flow
num_bytes: 1943717
num_examples: 119
download_size: 4272128
dataset_size: 12510954
configs:
- config_name: default
data_files:
- split: graphscope
path: data/graphscope-*
- split: evaluate
path: data/evaluate-*
- split: fate_flow
path: data/fate_flow-*
---
|
open-llm-leaderboard/details_nbeerbower__flammen8-mistral-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen8-mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen8-mistral-7B](https://huggingface.co/nbeerbower/flammen8-mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen8-mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T22:49:37.512588](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen8-mistral-7B/blob/main/results_2024-03-21T22-49-37.512588.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543148554791158,\n\
\ \"acc_stderr\": 0.032162650890367815,\n \"acc_norm\": 0.6542666705080402,\n\
\ \"acc_norm_stderr\": 0.03282709834585556,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7201767439915395,\n\
\ \"mc2_stderr\": 0.01452233141133894\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238361,\n\
\ \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.013131238126975576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7034455287791277,\n\
\ \"acc_stderr\": 0.004558049018764651,\n \"acc_norm\": 0.880601473809998,\n\
\ \"acc_norm_stderr\": 0.003235941810943153\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997112,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7201767439915395,\n\
\ \"mc2_stderr\": 0.01452233141133894\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359237\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515432\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen8-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|arc:challenge|25_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|gsm8k|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hellaswag|10_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-49-37.512588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T22-49-37.512588.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- '**/details_harness|winogrande|5_2024-03-21T22-49-37.512588.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T22-49-37.512588.parquet'
- config_name: results
data_files:
- split: 2024_03_21T22_49_37.512588
path:
- results_2024-03-21T22-49-37.512588.parquet
- split: latest
path:
- results_2024-03-21T22-49-37.512588.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen8-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen8-mistral-7B](https://huggingface.co/nbeerbower/flammen8-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen8-mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T22:49:37.512588](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen8-mistral-7B/blob/main/results_2024-03-21T22-49-37.512588.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543148554791158,
"acc_stderr": 0.032162650890367815,
"acc_norm": 0.6542666705080402,
"acc_norm_stderr": 0.03282709834585556,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7201767439915395,
"mc2_stderr": 0.01452233141133894
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.013131238126975576
},
"harness|hellaswag|10": {
"acc": 0.7034455287791277,
"acc_stderr": 0.004558049018764651,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.003235941810943153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997112,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7201767439915395,
"mc2_stderr": 0.01452233141133894
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359237
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515432
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AhmedBou/French_quotes | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
language:
- fr
size_categories:
- 1K<n<10K
--- |
Christin/cc_news_temporal | ---
license: apache-2.0
dataset_info:
features:
- name: paragraph_1
dtype: string
- name: paragraph_2
dtype: string
- name: label
dtype: int64
- name: date_1
dtype: date32
- name: date_2
dtype: date32
splits:
- name: train
num_bytes: 21012929367
num_examples: 24834751
- name: test
num_bytes: 1167639274
num_examples: 1379708
- name: valid
num_bytes: 1168397834
num_examples: 1379709
download_size: 17713200467
dataset_size: 23348966475
---
|
shuttie/dadjokes | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
---
# Dad Jokes dataset
This dataset is generated from the [Kaggle Reddit Dad Jokes](https://www.kaggle.com/datasets/oktayozturk010/reddit-dad-jokes) by [Oktay Ozturk](https://www.kaggle.com/oktayozturk010), with the following modifications:
* Only jokes with 5+ votes were sampled; less-upvoted jokes are too cringe.
* With a set of heuristics, each joke was split into two parts: the base and the punchline.
## Format
The dataset is formatted as a CSV, and is split into train/test parts:
* train: 52000 samples
* test: 1400 samples
```csv
"question","response"
"I asked my priest how he gets holy water","He said it’s just regular water, he just boils the hell out of it"
"Life Hack: If you play My Chemical Romance loud enough in your yard","your grass will cut itself"
"Why did Mr. Potato Head get pulled over","He was baked"
"How did the Mexican John Wick taste his Burrito","He took Juan Lick"
```
## Usage
With a base/punchline split, this dataset can be used for a joke prediction task with any LLM.
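As a minimal sketch, the two-column CSV format shown above can be parsed with Python's standard `csv` module (the sample rows below are copied from the format example; in practice you would open the train/test CSV files instead of an in-memory string):

```python
import csv
import io

# Sample rows in the dataset's "question","response" format, taken from the card above
sample = '''"question","response"
"Why did Mr. Potato Head get pulled over","He was baked"
"How did the Mexican John Wick taste his Burrito","He took Juan Lick"
'''

# DictReader uses the header row as field names
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    # Each row pairs a joke setup (base) with its punchline
    print(f"{row['question']} -- {row['response']}")
```

The same loop works unchanged on the full train/test splits once the files are opened with `open(path, newline="")`.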
## License
Apache 2.0. |
distilled-from-one-sec-cv12/chunk_246 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 834227128
num_examples: 162554
download_size: 852110143
dataset_size: 834227128
---
# Dataset Card for "chunk_246"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ucinlp/drop | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
- text2text-generation
task_ids:
- extractive-qa
- abstractive-qa
paperswithcode_id: drop
pretty_name: DROP
dataset_info:
features:
- name: section_id
dtype: string
- name: query_id
dtype: string
- name: passage
dtype: string
- name: question
dtype: string
- name: answers_spans
sequence:
- name: spans
dtype: string
- name: types
dtype: string
splits:
- name: train
num_bytes: 105572506
num_examples: 77400
- name: validation
num_bytes: 11737755
num_examples: 9535
download_size: 11538387
dataset_size: 117310261
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "drop"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://allenai.org/data/drop
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** https://aclanthology.org/N19-1246/
- **Paper:** https://arxiv.org/abs/1903.00161
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 8.30 MB
- **Size of the generated dataset:** 110.91 MB
- **Total amount of disk used:** 119.21 MB
### Dataset Summary
DROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs.
DROP is a crowdsourced, adversarially-created, 96k-question benchmark, in which a system must resolve references in a
question, perhaps to multiple input positions, and perform discrete operations over them (such as addition, counting, or
sorting). These operations require a much more comprehensive understanding of the content of paragraphs than was
necessary for prior datasets.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 8.30 MB
- **Size of the generated dataset:** 110.91 MB
- **Total amount of disk used:** 119.21 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers_spans": {
"spans": ["Chaz Schilens"]
},
"passage": "\" Hoping to rebound from their loss to the Patriots, the Raiders stayed at home for a Week 16 duel with the Houston Texans. Oak...",
"question": "Who scored the first touchdown of the game?"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `passage`: a `string` feature.
- `question`: a `string` feature.
- `answers_spans`: a dictionary feature containing:
- `spans`: a `string` feature.
### Data Splits
| name |train|validation|
|-------|----:|---------:|
|default|77409| 9536|
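As a sketch of how the fields above fit together, the cropped validation instance shown earlier can be treated as a plain dict; `answers_spans` is a sequence feature holding parallel `spans` and `types` lists (the `"span"` type value below is illustrative, not taken from the card):

```python
# The cropped validation instance from above, as a plain dict.
# The "types" entry is a hypothetical example value for illustration.
example = {
    "answers_spans": {"spans": ["Chaz Schilens"], "types": ["span"]},
    "passage": "Hoping to rebound from their loss to the Patriots, "
               "the Raiders stayed at home for a Week 16 duel ...",
    "question": "Who scored the first touchdown of the game?",
}

# spans and types are parallel lists: one type label per answer span
for span, span_type in zip(example["answers_spans"]["spans"],
                           example["answers_spans"]["types"]):
    print(f"{span!r} ({span_type})")
```

Loaded through the `datasets` library, each row of the `train` and `validation` splits exposes the same nested structure.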
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{Dua2019DROP,
author={Dheeru Dua and Yizhong Wang and Pradeep Dasigi and Gabriel Stanovsky and Sameer Singh and Matt Gardner},
title={ {DROP}: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs},
booktitle={Proc. of NAACL},
year={2019}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun) for adding this dataset. |
Hack90/ncbi_genbank_part_34 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 33614209063
num_examples: 62458
download_size: 15102741772
dataset_size: 33614209063
---
# Dataset Card for "ncbi_genbank_part_34"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aiden07/dota2_instruct_prompt | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
Instruction-answer dataset generated with GPT-3.5 Turbo using (HTML) data scraped from the [fandom wiki](https://dota2.fandom.com/wiki/Dota_2_Wiki). Data includes the following topics:
- Heroes
- Background lore
- Attributes / Stats
- Abilities
- Talents
- Runes
- Buildings
- Items
- Gameplay mechanics
- Creeps
Pending enhancements:
- Data cleaning/preprocessing before the scraped data is fed into GPT-3.5 Turbo for instruction-answer set generation
- Strategy data for each hero, i.e. a guide to playing each hero
- Individual items' properties
- Types of creeps in detail
- Types of runes and buildings in detail |
communityai/ise-uiuc___Magicoder-OSS-Instruct-75K | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 397520573.0
num_examples: 72403
download_size: 160164505
dataset_size: 397520573.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|