| datasetId | card |
|---|---|
c-demartino/llama-2-7b-chat-paragraphs | ---
license: apache-2.0
---
|
dmayhem93/agieval-aqua-rat | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 93696
num_examples: 254
download_size: 0
dataset_size: 93696
license: apache-2.0
---
# Dataset Card for "agieval-aqua-rat"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo.
Raw dataset: https://github.com/deepmind/AQuA
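Given the schema above (`query`: string, `choices`: sequence of strings, `gold`: sequence of int64 indices into `choices`), a minimal sketch of reading the gold answer out of a record. The record text below is invented for illustration, not taken from the dataset:

```python
# Hypothetical record shaped like the card's features; the question text is made up.
example = {
    "query": "Q: If 3x + 2 = 11, what is x? Answer Choices: (A)1 (B)2 (C)3 (D)4 (E)5",
    "choices": ["(A)1", "(B)2", "(C)3", "(D)4", "(E)5"],
    "gold": [2],
}

def gold_answers(record):
    """Map each gold index to its answer-choice string."""
    return [record["choices"][i] for i in record["gold"]]

print(gold_answers(example))  # ['(C)3']
```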
Copyright 2017 Google Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{ling-etal-2017-program,
title = "Program Induction by Rationale Generation: Learning to Solve and Explain Algebraic Word Problems",
author = "Ling, Wang and
Yogatama, Dani and
Dyer, Chris and
Blunsom, Phil",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1015",
doi = "10.18653/v1/P17-1015",
pages = "158--167",
abstract = "Solving algebraic word problems requires executing a series of arithmetic operations{---}a program{---}to obtain a final answer. However, since programs can be arbitrarily complicated, inducing them directly from question-answer pairs is a formidable challenge. To make this task more feasible, we solve these problems by generating answer rationales, sequences of natural language and human-readable mathematical expressions that derive the final answer through a series of small steps. Although rationales do not explicitly specify programs, they provide a scaffolding for their structure via intermediate milestones. To evaluate our approach, we have created a new 100,000-sample dataset of questions, answers and rationales. Experimental results show that indirect supervision of program learning via answer rationales is a promising strategy for inducing arithmetic programs.",
} |
distilled-from-one-sec-cv12/chunk_13 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1196705420
num_examples: 233185
download_size: 1214418696
dataset_size: 1196705420
---
# Dataset Card for "chunk_13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/massive_val_DA5_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 16518310
num_examples: 24160
download_size: 3778628
dataset_size: 16518310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "massive_val_DA5_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nerfgun3/torino_art | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Torino Artist Embedding / Textual Inversion
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"drawn by torino_art"```
If it is too strong, just add [] around it.
Trained for 12800 steps.
Have fun :)
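The `[]` trick works because the webui scales a token's attention: by the commonly cited default rule, each `()` pair multiplies the weight by 1.1 and each `[]` pair divides it by 1.1. A rough sketch of that rule (an assumption about the default behavior; explicit `(token:1.3)` weights are not handled here):

```python
def attention_weight(token: str) -> float:
    """Approximate attention multiplier: each surrounding () pair
    multiplies by 1.1, each [] pair divides by 1.1 (assumed defaults)."""
    weight = 1.0
    while token.startswith("(") and token.endswith(")"):
        token = token[1:-1]
        weight *= 1.1
    while token.startswith("[") and token.endswith("]"):
        token = token[1:-1]
        weight /= 1.1
    return weight

print(attention_weight("[drawn by torino_art]"))  # ~0.909, i.e. weakened
```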
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/xnRZgRb.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/AcHsCMX.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/egIlKhy.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/nZQh3da.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/V9UFqn2.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M license with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
Dampish/3k-Instruction-Questions | ---
license: cc-by-nc-4.0
---
|
Multimodal-Fatima/DTD_parition1_train_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 236557256.4
num_examples: 1880
download_size: 237044519
dataset_size: 236557256.4
---
# Dataset Card for "DTD_parition1_train_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-sentiment-train | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 18860715
num_examples: 76772
download_size: 6417302
dataset_size: 18860715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fingpt-sentiment-train"
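Each row pairs an `instruction` and an `input` with a sentiment `output`. A minimal sketch of turning a row into a supervised-finetuning prompt; the row text and the template below are invented for illustration and are not necessarily the format FinGPT used:

```python
# Hypothetical row shaped like the card's features (instruction, input, output).
row = {
    "instruction": "What is the sentiment of this news? Please choose an answer "
                   "from {negative/neutral/positive}.",
    "input": "Shares of the company rose 5% after strong quarterly earnings.",
    "output": "positive",
}

def build_prompt(r):
    """Join instruction and input in a simple template (an assumed format)."""
    return f"Instruction: {r['instruction']}\nInput: {r['input']}\nAnswer: "

# The training target would be the prompt followed by the label.
print(build_prompt(row) + row["output"])
```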
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SaeedRahmani/codeparrot_github_code_powershell | ---
dataset_info:
features:
- name: code
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
splits:
- name: train
num_bytes: 1156043863
num_examples: 140000
download_size: 392578861
dataset_size: 1156043863
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
allenai/lila | ---
license: cc-by-4.0
---
# Lila: A Unified Benchmark for Mathematical Reasoning
## Dataset Description
- **Homepage:** https://lila.apps.allenai.org/
- **Repository:** [allenai/lila](https://github.com/allenai/lila)
- **Paper:** [LILA: A Unified Benchmark for Mathematical Reasoning](https://aclanthology.org/2022.emnlp-main.392.pdf)
- **Point of Contact:** [Matthew Finlayson](https://mattf1n.github.io/), [Sean Welleck](https://wellecks.com/)
### Licensing Information
Creative Commons Attribution 4.0 International
### Citation Information
Cite this dataset and the source datasets (see [sources.bib](https://github.com/allenai/Lila/blob/main/sources.bib)).
```bib
@INPROCEEDINGS{Mishra2022Lila,
author = {
Swaroop Mishra
and Matthew Finlayson
and Pan Lu
and Leonard Tang
and Sean Welleck
and Chitta Baral
and Tanmay Rajpurohit
and Oyvind Tafjord
and Ashish Sabharwal
and Peter Clark
and Ashwin Kalyan},
title = {Lila: A Unified Benchmark for Mathematical Reasoning},
booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
year = {2022}
}
```
|
Seanxh/twitter_dataset_1713198236 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 92942
num_examples: 216
download_size: 37027
dataset_size: 92942
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/train_free_20 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604553880
num_examples: 10000
download_size: 1246793614
dataset_size: 9604553880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zhangyi617/AE_adversarial_train_prompt_all_origin | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 78132033.0
num_examples: 180
download_size: 78131456
dataset_size: 78132033.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bfxwayne/data-docs | ---
license: apache-2.0
---
|
Falah/movie_action_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 301897024
num_examples: 1000000
download_size: 34184719
dataset_size: 301897024
---
# Dataset Card for "movie_action_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KeyonZeng__philion-2 | ---
pretty_name: Evaluation run of KeyonZeng/philion-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KeyonZeng/philion-2](https://huggingface.co/KeyonZeng/philion-2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KeyonZeng__philion-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T05:15:42.641608](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__philion-2/blob/main/results_2024-03-25T05-15-42.641608.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5826623781651874,\n\
\ \"acc_stderr\": 0.03359326274407904,\n \"acc_norm\": 0.5846963354817044,\n\
\ \"acc_norm_stderr\": 0.034276494899318236,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.4447100247374194,\n\
\ \"mc2_stderr\": 0.014982640206881327\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996076,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.014212444980651894\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5612427803226449,\n\
\ \"acc_stderr\": 0.004952209831856575,\n \"acc_norm\": 0.7506472814180443,\n\
\ \"acc_norm_stderr\": 0.004317541575275679\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.02598850079241189,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.02598850079241189\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819115,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787575,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787575\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033086111132364364,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033086111132364364\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\
\ \"acc_stderr\": 0.016409091097268787,\n \"acc_norm\": 0.698595146871009,\n\
\ \"acc_norm_stderr\": 0.016409091097268787\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608415,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608415\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.02760468902858199,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.02760468902858199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336463,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336463\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.4447100247374194,\n\
\ \"mc2_stderr\": 0.014982640206881327\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865708\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5261561789234268,\n \
\ \"acc_stderr\": 0.013753627037255044\n }\n}\n```"
repo_url: https://huggingface.co/KeyonZeng/philion-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|arc:challenge|25_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|gsm8k|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hellaswag|10_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-15-42.641608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T05-15-42.641608.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- '**/details_harness|winogrande|5_2024-03-25T05-15-42.641608.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T05-15-42.641608.parquet'
- config_name: results
data_files:
- split: 2024_03_25T05_15_42.641608
path:
- results_2024-03-25T05-15-42.641608.parquet
- split: latest
path:
- results_2024-03-25T05-15-42.641608.parquet
---
# Dataset Card for Evaluation run of KeyonZeng/philion-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KeyonZeng/philion-2](https://huggingface.co/KeyonZeng/philion-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KeyonZeng__philion-2",
	"harness_winogrande_5",
	split="latest")
```
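As the configuration list above shows, each timestamped split name is simply the run's ISO timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name is illustrative, not part of the `datasets` library):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as in the results filename) to its split name."""
    # Split names replace '-' and ':' in the ISO timestamp with '_'
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-25T05:15:42.641608"))
# 2024_03_25T05_15_42.641608
```

This lets you address a specific run's split programmatically instead of the "latest" alias.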
## Latest results
These are the [latest results from run 2024-03-25T05:15:42.641608](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__philion-2/blob/main/results_2024-03-25T05-15-42.641608.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5826623781651874,
"acc_stderr": 0.03359326274407904,
"acc_norm": 0.5846963354817044,
"acc_norm_stderr": 0.034276494899318236,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.4447100247374194,
"mc2_stderr": 0.014982640206881327
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996076,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.014212444980651894
},
"harness|hellaswag|10": {
"acc": 0.5612427803226449,
"acc_stderr": 0.004952209831856575,
"acc_norm": 0.7506472814180443,
"acc_norm_stderr": 0.004317541575275679
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.02598850079241189,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.02598850079241189
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819115,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.0284934650910286,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.0284934650910286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787575,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787575
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033086111132364364,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033086111132364364
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.698595146871009,
"acc_stderr": 0.016409091097268787,
"acc_norm": 0.698595146871009,
"acc_norm_stderr": 0.016409091097268787
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608415,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608415
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.02760468902858199,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.02760468902858199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336463,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336463
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.02006287424353913,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.02006287424353913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.4447100247374194,
"mc2_stderr": 0.014982640206881327
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865708
},
"harness|gsm8k|5": {
"acc": 0.5261561789234268,
"acc_stderr": 0.013753627037255044
}
}
```
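If needed, the per-task scores in the results blob above can be aggregated programmatically. A minimal sketch (the two task entries below are illustrative excerpts copied from the table, not the full result set):

```python
# Minimal sketch: average per-task accuracies from a results dict shaped like
# the JSON above. The two entries are illustrative excerpts, not the full set.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8076923076923077},
    "harness|hendrycksTest-virology|5": {"acc": 0.46987951807228917},
}
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(round(mean_acc, 4))  # 0.6388
```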
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
khalidalt/tydiqa-primary | ---
pretty_name: TyDi QA
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
- ar
- bn
- fi
- id
- ja
- sw
- ko
- ru
- te
- th
license:
- apache-2.0
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: tydi-qa
---
# Dataset Card for "tydiqa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/google-research-datasets/tydiqa](https://github.com/google-research-datasets/tydiqa)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3726.74 MB
- **Size of the generated dataset:** 5812.92 MB
- **Total amount of disk used:** 9539.67 MB
### Dataset Summary
TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs.
The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language
expresses -- such that we expect models performing well on this set to generalize across a large number of the languages
in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic
information-seeking task and avoid priming effects, questions are written by people who want to know the answer but
do not yet know it (unlike SQuAD and its descendants), and the data is collected directly in each language without
the use of translation (unlike MLQA and XQuAD).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### primary_task
- **Size of downloaded dataset files:** 1863.37 MB
- **Size of the generated dataset:** 5757.59 MB
- **Total amount of disk used:** 7620.96 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"annotations": {
"minimal_answers_end_byte": [-1, -1, -1],
"minimal_answers_start_byte": [-1, -1, -1],
"passage_answer_candidate_index": [-1, -1, -1],
"yes_no_answer": ["NONE", "NONE", "NONE"]
},
"document_plaintext": "\"\\nรองศาสตราจารย์[1] หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร (22 กันยายน 2495 -) ผู้ว่าราชการกรุงเทพมหานครคนที่ 15 อดีตรองหัวหน้าพรรคปร...",
"document_title": "หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร",
"document_url": "\"https://th.wikipedia.org/wiki/%E0%B8%AB%E0%B8%A1%E0%B9%88%E0%B8%AD%E0%B8%A1%E0%B8%A3%E0%B8%B2%E0%B8%8A%E0%B8%A7%E0%B8%87%E0%B8%...",
"language": "thai",
"passage_answer_candidates": "{\"plaintext_end_byte\": [494, 1779, 2931, 3904, 4506, 5588, 6383, 7122, 8224, 9375, 10473, 12563, 15134, 17765, 19863, 21902, 229...",
"question_text": "\"หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร เรียนจบจากที่ไหน ?\"..."
}
```
### Data Fields
The data fields are the same among all splits.
#### primary_task
- `passage_answer_candidates`: a dictionary feature containing:
  - `plaintext_start_byte`: an `int32` feature.
  - `plaintext_end_byte`: an `int32` feature.
- `question_text`: a `string` feature.
- `document_title`: a `string` feature.
- `language`: a `string` feature.
- `annotations`: a dictionary feature containing:
  - `passage_answer_candidate_index`: an `int32` feature.
  - `minimal_answers_start_byte`: an `int32` feature.
  - `minimal_answers_end_byte`: an `int32` feature.
- `yes_no_answer`: a `string` feature.
- `document_plaintext`: a `string` feature.
- `document_url`: a `string` feature.
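Note that the answer offsets above are byte positions into the UTF-8 encoding of `document_plaintext` (with `-1` meaning no minimal answer), so slicing must happen on bytes rather than characters. A minimal helper sketch (the toy strings below are illustrative, not drawn from the dataset):

```python
def extract_minimal_answer(document_plaintext, start_byte, end_byte):
    """Slice a minimal answer out of the document using byte offsets.

    TyDi QA offsets index into the UTF-8 encoding of the document;
    -1 means the annotator found no minimal answer.
    """
    if start_byte == -1 or end_byte == -1:
        return None
    raw = document_plaintext.encode("utf-8")
    return raw[start_byte:end_byte].decode("utf-8")

# Toy examples (illustrative only): multi-byte scripts need byte-level slicing.
print(extract_minimal_answer("Bangkok is the capital.", 0, 7))  # Bangkok
print(extract_minimal_answer("กรุงเทพ", 0, 9))  # first 3 code points (3 bytes each)
```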
### Data Splits
| name | train | validation |
| -------------- | -----: | ---------: |
| primary_task | 166916 | 18670 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{tydiqa,
title = {TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author = {Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year = {2020},
journal = {Transactions of the Association for Computational Linguistics}
}
```
```
@inproceedings{ruder-etal-2021-xtreme,
title = "{XTREME}-{R}: Towards More Challenging and Nuanced Multilingual Evaluation",
author = "Ruder, Sebastian and
Constant, Noah and
Botha, Jan and
Siddhant, Aditya and
Firat, Orhan and
Fu, Jinlan and
Liu, Pengfei and
Hu, Junjie and
Garrette, Dan and
Neubig, Graham and
Johnson, Melvin",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.802",
doi = "10.18653/v1/2021.emnlp-main.802",
pages = "10215--10245",
}
```
|
nielsr/breast-cancer | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 42431652.0
num_examples: 130
download_size: 0
dataset_size: 42431652.0
---
# Dataset Card for "breast-cancer"
This dataset was taken from the MedSAM project and is used in [this notebook](https://github.com/NielsRogge/Transformers-Tutorials/blob/master/SAM/Fine_tune_SAM_(segment_anything)_on_a_custom_dataset.ipynb), which fine-tunes Meta's SAM model on it.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_en_s1 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 0
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_s1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bowlen/Sullivan | ---
license: openrail
---
|
liangzid/contract_types_sampled_200 | ---
license: mit
---
|
ylacombe/mls-eng-10k-tags_tagged_10k | ---
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: original_text
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 3668471
num_examples: 3807
- name: test
num_bytes: 3646267
num_examples: 3769
- name: train
num_bytes: 2336493497
num_examples: 2420047
download_size: 1319807391
dataset_size: 2343808235
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
Nexdata/Russian_Speaking_English_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Russian_Speaking_English_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1042?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is recorded by 498 native Russian speakers with a balanced gender distribution. It is rich in content, covering generic command and control; human-machine interaction; smart home command and control; and in-car command and control categories. The transcription corpus has been manually proofread to ensure high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1042?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Russian English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
AlvianKhairi/my-pandas-dataset-Abstract_Link | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 455609414
num_examples: 552066
download_size: 173420444
dataset_size: 455609414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_Link"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_202 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1191183388
num_examples: 232109
download_size: 1217752107
dataset_size: 1191183388
---
# Dataset Card for "chunk_202"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
communityai/gretelai___synthetic_text_to_sql-10k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 8430293.9
num_examples: 10000
download_size: 3007700
dataset_size: 8430293.9
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cheafdevo56/Influential_NonCitedNegs_10_Percent_large | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 350265367.8
num_examples: 90000
- name: validation
num_bytes: 38918374.2
num_examples: 10000
download_size: 233747530
dataset_size: 389183742.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
LHF/escorpius-m | ---
license: cc-by-nc-nd-4.0
language:
- af
- ar
- bn
- ca
- cs
- da
- de
- el
- eu
- fa
- fi
- fr
- gl
- hi
- hr
- it
- ja
- ko
- mt
- nl
- no
- oc
- pa
- pl
- pt
- ro
- sl
- sr
- sv
- tr
- uk
- ur
multilinguality:
- multilingual
size_categories:
- 100B<n<1T
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# esCorpius Multilingual
In recent years, Transformer-based models have led to significant advances in language modelling for natural language processing. However, they require a vast amount of data to be (pre-)trained, and there is a lack of corpora in languages other than English. Recently, several initiatives have presented multilingual datasets obtained from automatic web crawling. However, these present important shortcomings for languages other than English, as they are either too small or of low quality due to sub-optimal cleaning and deduplication. In this repository, we introduce esCorpius-m, a multilingual crawling corpus obtained from nearly 1 PB of Common Crawl data. For several of the languages covered, it is the most extensive corpus with this level of quality in the extraction, purification and deduplication of web textual content. Our data curation process involves a novel, highly parallel cleaning pipeline and encompasses a series of deduplication mechanisms that together ensure the integrity of both document and paragraph boundaries. Additionally, we retain both the source web page URL and the WARC shard origin URL in order to comply with EU regulations. esCorpius-m has been released under the CC BY-NC-ND 4.0 license.
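The paragraph-level deduplication mentioned above can be illustrated with a small sketch. This is a simplified toy version (hash-based exact matching on normalized paragraphs), not the actual pipeline code:

```python
import hashlib

def dedup_paragraphs(documents):
    """Toy sketch of paragraph-level deduplication: drop paragraphs whose
    normalized hash was already seen, while keeping document boundaries."""
    seen = set()
    cleaned = []
    for doc in documents:
        kept = []
        for para in doc.split("\n"):
            key = hashlib.sha1(para.strip().lower().encode("utf-8")).hexdigest()
            if key not in seen:
                seen.add(key)
                kept.append(para)
        cleaned.append("\n".join(kept))
    return cleaned

docs = ["Hello world.\nUnique line.", "Hello world.\nAnother line."]
print(dedup_paragraphs(docs))  # the duplicated "Hello world." paragraph is dropped
```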
## Usage
Replace `revision` with the language of your choice (in this case, `it` for Italian):
```python
from datasets import load_dataset

dataset = load_dataset('LHF/escorpius-m', split='train', streaming=True, revision='it')
```
## Other corpora
- esCorpius-mr multilingual *raw* corpus (not deduplicated): https://huggingface.co/datasets/LHF/escorpius-mr
- esCorpius original *Spanish only* corpus (deduplicated): https://huggingface.co/datasets/LHF/escorpius
## Citation
Link to paper: https://www.isca-speech.org/archive/pdfs/iberspeech_2022/gutierrezfandino22_iberspeech.pdf / https://arxiv.org/abs/2206.15147
Cite this work:
```
@inproceedings{gutierrezfandino22_iberspeech,
author={Asier Gutiérrez-Fandiño and David Pérez-Fernández and Jordi Armengol-Estapé and David Griol and Zoraida Callejas},
title={{esCorpius: A Massive Spanish Crawling Corpus}},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
year=2022,
booktitle={Proc. IberSPEECH 2022},
pages={126--130},
doi={10.21437/IberSPEECH.2022-26}
}
```
## Disclaimer
We did not perform any kind of filtering and/or censorship to the corpus. We expect users to do so applying their own methods. We are not liable for any misuse of the corpus.
|
kvpratama/pokemon-images-dataset | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 41347049
num_examples: 819
download_size: 41350027
dataset_size: 41347049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-to-image
language:
- en
size_categories:
- n<1K
---
# Dataset Card for pokemon-images-dataset
### Dataset Summary
A collection of images featuring Pokémon characters.
## Dataset Creation
### Context
I collected this dataset for my school project, which was to train a GAN to generate new Pokémon. I had a difficult time finding a training dataset that was complete and clean, so I gathered this collection of images and published it here, hoping it will help others who need a similar dataset.
You can find my project on my [Github][1]
My latest code to generate pokemon [Github][4]
### Content
819 transparent Pokémon images in PNG format, sized 256x256.
* Update August 10, 2020:
819 white-background images in JPG format
### Acknowledgements
I collected the images mostly from this website: [https://veekun.com/dex/downloads][2]
Banner image is taken from [https://viking011.deviantart.com/art/Pokemon-Poster-436455502][3]
### Inspiration
Since I failed to generate new Pokémon with clarity (I could only generate the shape), I hope others can manage it with this dataset. If you do, please share it!
[1]: https://github.com/kvpratama/gan/tree/master/pokemon
[2]: https://veekun.com/dex/downloads
[3]: https://viking011.deviantart.com/art/Pokemon-Poster-436455502
[4]: https://github.com/kvpratama/gan/tree/master/pokemon_dcgan
|
sdiazlor/data-drift-simulation-dataset | ---
dataset_info:
features:
- name: review
dtype: string
- name: rating
dtype: float64
- name: datetime
dtype: timestamp[ns]
- name: rewritten_reviews
dtype: string
splits:
- name: train
num_bytes: 303505
num_examples: 300
download_size: 184323
dataset_size: 303505
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iwahith/arrow_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29324
num_examples: 13
download_size: 14904
dataset_size: 29324
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "arrow_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_148 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 971416116
num_examples: 190773
download_size: 989563451
dataset_size: 971416116
---
# Dataset Card for "chunk_148"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_Diabetes130US_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 174960000
num_examples: 10000
- name: validation
num_bytes: 174960000
num_examples: 10000
download_size: 45382840
dataset_size: 349920000
---
# Dataset Card for "autotree_automl_Diabetes130US_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_47 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1180412164
num_examples: 231817
download_size: 1199989787
dataset_size: 1180412164
---
# Dataset Card for "chunk_47"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__medalpaca-13B-GPTQ-4bit | ---
pretty_name: Evaluation run of TheBloke/medalpaca-13B-GPTQ-4bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/medalpaca-13B-GPTQ-4bit](https://huggingface.co/TheBloke/medalpaca-13B-GPTQ-4bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__medalpaca-13B-GPTQ-4bit_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T11:22:05.804023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__medalpaca-13B-GPTQ-4bit_public/blob/main/results_2023-11-07T11-22-05.804023.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06973573825503356,\n\
\ \"em_stderr\": 0.0026083779557512714,\n \"f1\": 0.12751992449664398,\n\
\ \"f1_stderr\": 0.0028759868015646797,\n \"acc\": 0.26558800315706393,\n\
\ \"acc_stderr\": 0.00701257132031976\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06973573825503356,\n \"em_stderr\": 0.0026083779557512714,\n\
\ \"f1\": 0.12751992449664398,\n \"f1_stderr\": 0.0028759868015646797\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5311760063141279,\n\
\ \"acc_stderr\": 0.01402514264063952\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/medalpaca-13B-GPTQ-4bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T14_02_24.762310
path:
- '**/details_harness|drop|3_2023-11-05T14-02-24.762310.parquet'
- split: 2023_11_07T11_22_05.804023
path:
- '**/details_harness|drop|3_2023-11-07T11-22-05.804023.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T11-22-05.804023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T14_02_24.762310
path:
- '**/details_harness|gsm8k|5_2023-11-05T14-02-24.762310.parquet'
- split: 2023_11_07T11_22_05.804023
path:
- '**/details_harness|gsm8k|5_2023-11-07T11-22-05.804023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T11-22-05.804023.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T14_02_24.762310
path:
- '**/details_harness|winogrande|5_2023-11-05T14-02-24.762310.parquet'
- split: 2023_11_07T11_22_05.804023
path:
- '**/details_harness|winogrande|5_2023-11-07T11-22-05.804023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T11-22-05.804023.parquet'
- config_name: results
data_files:
- split: 2023_11_05T14_02_24.762310
path:
- results_2023-11-05T14-02-24.762310.parquet
- split: 2023_11_07T11_22_05.804023
path:
- results_2023-11-07T11-22-05.804023.parquet
- split: latest
path:
- results_2023-11-07T11-22-05.804023.parquet
---
# Dataset Card for Evaluation run of TheBloke/medalpaca-13B-GPTQ-4bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/medalpaca-13B-GPTQ-4bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/medalpaca-13B-GPTQ-4bit](https://huggingface.co/TheBloke/medalpaca-13B-GPTQ-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__medalpaca-13B-GPTQ-4bit_public",
"harness_winogrande_5",
split="train")
```
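Since the run splits listed in the configs above are named with zero-padded timestamps (`YYYY_MM_DDTHH_MM_SS...`), they sort lexicographically in chronological order. A minimal sketch (not part of the evaluation tooling; the split names below are taken from this card's configs) of picking the most recent run by hand:

```python
# Timestamped split names as they appear in this card's configs.
# Zero-padded fields mean plain string comparison matches chronological order.
split_names = [
    "2023_11_05T14_02_24.762310",
    "2023_11_07T11_22_05.804023",
]

latest_run = max(split_names)
print(latest_run)  # 2023_11_07T11_22_05.804023
```

In practice each configuration already exposes this as its `latest` split, so sorting by hand is only needed when comparing specific runs.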
## Latest results
These are the [latest results from run 2023-11-07T11:22:05.804023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__medalpaca-13B-GPTQ-4bit_public/blob/main/results_2023-11-07T11-22-05.804023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06973573825503356,
"em_stderr": 0.0026083779557512714,
"f1": 0.12751992449664398,
"f1_stderr": 0.0028759868015646797,
"acc": 0.26558800315706393,
"acc_stderr": 0.00701257132031976
},
"harness|drop|3": {
"em": 0.06973573825503356,
"em_stderr": 0.0026083779557512714,
"f1": 0.12751992449664398,
"f1_stderr": 0.0028759868015646797
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5311760063141279,
"acc_stderr": 0.01402514264063952
}
}
```
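The per-task keys above follow a `harness|<task>|<n_shot>` pattern. A small sketch (assuming a plain dict shaped like the JSON above) of pulling each task's accuracy out of such a results object:

```python
# Hypothetical post-processing of a results dict shaped like the JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.5311760063141279,
                             "acc_stderr": 0.01402514264063952},
}

# Split each "harness|<task>|<n_shot>" key on "|" to recover the task name,
# keeping only the accuracy for each task.
accuracies = {key.split("|")[1]: scores["acc"]
              for key, scores in results.items()}
print(accuracies)  # {'gsm8k': 0.0, 'winogrande': 0.5311760063141279}
```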
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dbaek111/customdata | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7280
num_examples: 40
download_size: 5764
dataset_size: 7280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
colab-account/lotd-models | ---
license: wtfpl
---
|
idning/ffhq256-caption | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7388635414.0
num_examples: 70000
download_size: 7386868493
dataset_size: 7388635414.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
StoneSeller/twitter_raw | ---
dataset_info:
features:
- name: index
dtype: int64
- name: Q
dtype: string
- name: A
dtype: string
splits:
- name: train
num_bytes: 2149019
num_examples: 10607
- name: valid
num_bytes: 478895
num_examples: 2652
download_size: 1304645
dataset_size: 2627914
---
# Dataset Card for "twitter_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
temasarkisov/MinimalLogos_converted_processed_V2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1063489.0
num_examples: 55
download_size: 1059361
dataset_size: 1063489.0
---
# Dataset Card for "MinimalLogos_converted_processed_V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_C_A_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 1016729
num_examples: 1000
download_size: 164564
dataset_size: 1016729
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_C_A_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CabraVC/vector_dataset_roberta-fine-tuned | ---
dataset_info:
features:
- name: texts
dtype: string
- name: labels
dtype:
class_label:
names:
'0': BUY
'1': HOLD
'2': SELL
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 30663495.772859924
num_examples: 3289
- name: val
num_bytes: 3831771.590953307
num_examples: 411
- name: test
num_bytes: 3841094.6361867706
num_examples: 412
download_size: 27783754
dataset_size: 38336362.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-19000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1008125
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PhilSad/Instruct-fr-merged-35k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17414551.887416366
num_examples: 35000
download_size: 8991276
dataset_size: 17414551.887416366
---
# Dataset Card for "Instruct-fr-merged-35k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T08:55:09.441353](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-06T08-55-09.441353.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6321651928198004,\n\
\ \"acc_stderr\": 0.03241329296366643,\n \"acc_norm\": 0.635985368424325,\n\
\ \"acc_norm_stderr\": 0.03305944195752434,\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6171088183728592,\n\
\ \"mc2_stderr\": 0.015045730588189423\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.01411797190114282,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.662617008564031,\n\
\ \"acc_stderr\": 0.0047185047710837655,\n \"acc_norm\": 0.8572993427604063,\n\
\ \"acc_norm_stderr\": 0.0034905249650619067\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518721,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518721\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134135,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134135\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6171088183728592,\n\
\ \"mc2_stderr\": 0.015045730588189423\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4761182714177407,\n \
\ \"acc_stderr\": 0.013756765835465753\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|arc:challenge|25_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|arc:challenge|25_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|gsm8k|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|gsm8k|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hellaswag|10_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hellaswag|10_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-06-52.185806.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T08-55-09.441353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T08-55-09.441353.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- '**/details_harness|winogrande|5_2024-01-06T05-06-52.185806.parquet'
- split: 2024_01_06T08_55_09.441353
path:
- '**/details_harness|winogrande|5_2024-01-06T08-55-09.441353.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T08-55-09.441353.parquet'
- config_name: results
data_files:
- split: 2024_01_06T05_06_52.185806
path:
- results_2024-01-06T05-06-52.185806.parquet
- split: 2024_01_06T08_55_09.441353
path:
- results_2024-01-06T08-55-09.441353.parquet
- split: latest
path:
- results_2024-01-06T08-55-09.441353.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser",
"harness_winogrande_5",
split="train")
```
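The details repository id in the example follows a simple naming convention: `details_` plus the model id with `/` replaced by `__`, under the `open-llm-leaderboard` organization. A minimal sketch of that mapping (the helper name is ours, not part of the leaderboard tooling, and note that some details repos carry an extra suffix such as `_public`):

```python
def details_repo(model_id: str, org: str = "open-llm-leaderboard") -> str:
    """Build the details-dataset repo id for a model evaluated on the leaderboard."""
    return f"{org}/details_{model_id.replace('/', '__')}"

# Reproduces the repo id used in the load_dataset call above:
print(details_repo("cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser"))
# -> open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser
```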
## Latest results
These are the [latest results from run 2024-01-06T08:55:09.441353](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-06T08-55-09.441353.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6321651928198004,
"acc_stderr": 0.03241329296366643,
"acc_norm": 0.635985368424325,
"acc_norm_stderr": 0.03305944195752434,
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6171088183728592,
"mc2_stderr": 0.015045730588189423
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.01411797190114282,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902274
},
"harness|hellaswag|10": {
"acc": 0.662617008564031,
"acc_stderr": 0.0047185047710837655,
"acc_norm": 0.8572993427604063,
"acc_norm_stderr": 0.0034905249650619067
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469553,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469553
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518721,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518721
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134135,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134135
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.01268397251359881,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.01268397251359881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6171088183728592,
"mc2_stderr": 0.015045730588189423
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.4761182714177407,
"acc_stderr": 0.013756765835465753
}
}
```
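The headline "all" accuracy above is (approximately) the unweighted mean of the per-task accuracies. A minimal sketch of that aggregation over a results dict of this shape (toy values, not the numbers above; the helper name is ours, not part of the evaluation harness):

```python
def mean_acc(results: dict) -> float:
    """Unweighted mean of "acc" over per-task entries, skipping the "all"
    summary and tasks (e.g. truthfulqa:mc) that report no "acc"."""
    accs = [v["acc"] for task, v in results.items() if task != "all" and "acc" in v]
    return sum(accs) / len(accs)

sample = {
    "all": {"acc": 0.65},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.7},
    "harness|truthfulqa:mc|0": {"mc1": 0.4, "mc2": 0.6},
}
print(round(mean_acc(sample), 2))  # prints 0.65
```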
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__WizardLM-30B-Uncensored-GPTQ | ---
pretty_name: Evaluation run of TheBloke/WizardLM-30B-Uncensored-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-30B-Uncensored-GPTQ](https://huggingface.co/TheBloke/WizardLM-30B-Uncensored-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-30B-Uncensored-GPTQ_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T17:24:26.800307](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-Uncensored-GPTQ_public/blob/main/results_2023-11-07T17-24-26.800307.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11220637583892618,\n\
\ \"em_stderr\": 0.003232246172292982,\n \"f1\": 0.19735633389261756,\n\
\ \"f1_stderr\": 0.0034729011607307052,\n \"acc\": 0.47120764875928467,\n\
\ \"acc_stderr\": 0.01184381041429583\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.11220637583892618,\n \"em_stderr\": 0.003232246172292982,\n\
\ \"f1\": 0.19735633389261756,\n \"f1_stderr\": 0.0034729011607307052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21076573161485973,\n \
\ \"acc_stderr\": 0.011234280469030465\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-30B-Uncensored-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_07T17_24_26.800307
path:
- '**/details_harness|drop|3_2023-11-07T17-24-26.800307.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T17-24-26.800307.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_07T17_24_26.800307
path:
- '**/details_harness|gsm8k|5_2023-11-07T17-24-26.800307.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T17-24-26.800307.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_07T17_24_26.800307
path:
- '**/details_harness|winogrande|5_2023-11-07T17-24-26.800307.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T17-24-26.800307.parquet'
- config_name: results
data_files:
- split: 2023_11_07T17_24_26.800307
path:
- results_2023-11-07T17-24-26.800307.parquet
- split: latest
path:
- results_2023-11-07T17-24-26.800307.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-Uncensored-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-30B-Uncensored-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-30B-Uncensored-GPTQ](https://huggingface.co/TheBloke/WizardLM-30B-Uncensored-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-Uncensored-GPTQ_public",
"harness_winogrande_5",
split="train")
```
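The timestamped split names listed in the configurations above are derived from the run timestamp by replacing `-` and `:` with `_` (the `.` in the fractional seconds is kept). A minimal sketch of that mapping (the helper name is ours, not part of the leaderboard tooling):

```python
def split_name(run_timestamp: str) -> str:
    """Convert a run timestamp to the split name used in this dataset."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-11-07T17:24:26.800307"))
# -> 2023_11_07T17_24_26.800307
```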
## Latest results
These are the [latest results from run 2023-11-07T17:24:26.800307](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-Uncensored-GPTQ_public/blob/main/results_2023-11-07T17-24-26.800307.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11220637583892618,
"em_stderr": 0.003232246172292982,
"f1": 0.19735633389261756,
"f1_stderr": 0.0034729011607307052,
"acc": 0.47120764875928467,
"acc_stderr": 0.01184381041429583
},
"harness|drop|3": {
"em": 0.11220637583892618,
"em_stderr": 0.003232246172292982,
"f1": 0.19735633389261756,
"f1_stderr": 0.0034729011607307052
},
"harness|gsm8k|5": {
"acc": 0.21076573161485973,
"acc_stderr": 0.011234280469030465
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ad6398/Deepmind-CodeContest-Unrolled | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: public_tests
struct:
- name: input
sequence: string
- name: output
sequence: string
- name: private_tests
struct:
- name: input
sequence: string
- name: output
sequence: string
- name: solution_type
dtype: string
- name: programming_language
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 243325073835
num_examples: 13086199
- name: test
num_bytes: 1002995714
num_examples: 65753
- name: valid
num_bytes: 2650695693
num_examples: 58488
download_size: 37389624916
dataset_size: 246978765242
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
open-llm-leaderboard/details_allknowingroger__FrankenRoger-10B-passthrough | ---
pretty_name: Evaluation run of allknowingroger/FrankenRoger-10B-passthrough
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/FrankenRoger-10B-passthrough](https://huggingface.co/allknowingroger/FrankenRoger-10B-passthrough)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__FrankenRoger-10B-passthrough\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T06:54:52.265631](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenRoger-10B-passthrough/blob/main/results_2024-04-11T06-54-52.265631.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6420559610555521,\n\
\ \"acc_stderr\": 0.03232145346181343,\n \"acc_norm\": 0.6445276882066361,\n\
\ \"acc_norm_stderr\": 0.03297808683614257,\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.7384132119906791,\n\
\ \"mc2_stderr\": 0.01454582251657146\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7154949213304123,\n\
\ \"acc_stderr\": 0.004502563079349392,\n \"acc_norm\": 0.8862776339374626,\n\
\ \"acc_norm_stderr\": 0.0031682493518893013\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382186,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276876,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276876\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059274,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059274\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073368,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073368\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768424,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768424\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760841,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760841\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678495,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678495\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904664,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135107,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n\
\ \"acc_stderr\": 0.012768108601640007,\n \"acc_norm\": 0.4908735332464146,\n\
\ \"acc_norm_stderr\": 0.012768108601640007\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482705,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482705\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.7384132119906791,\n\
\ \"mc2_stderr\": 0.01454582251657146\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.01037045555134334\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5049279757391963,\n \
\ \"acc_stderr\": 0.013771815775470578\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/FrankenRoger-10B-passthrough
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-54-52.265631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-54-52.265631.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- '**/details_harness|winogrande|5_2024-04-11T06-54-52.265631.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T06-54-52.265631.parquet'
- config_name: results
data_files:
- split: 2024_04_11T06_54_52.265631
path:
- results_2024-04-11T06-54-52.265631.parquet
- split: latest
path:
- results_2024-04-11T06-54-52.265631.parquet
---
# Dataset Card for Evaluation run of allknowingroger/FrankenRoger-10B-passthrough
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/FrankenRoger-10B-passthrough](https://huggingface.co/allknowingroger/FrankenRoger-10B-passthrough) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__FrankenRoger-10B-passthrough",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-11T06:54:52.265631](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenRoger-10B-passthrough/blob/main/results_2024-04-11T06-54-52.265631.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6420559610555521,
"acc_stderr": 0.03232145346181343,
"acc_norm": 0.6445276882066361,
"acc_norm_stderr": 0.03297808683614257,
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.7384132119906791,
"mc2_stderr": 0.01454582251657146
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7154949213304123,
"acc_stderr": 0.004502563079349392,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.0031682493518893013
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382186,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276876,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276876
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059274,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059274
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073368,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073368
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768424,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768424
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678495,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678495
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904664,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.012768108601640007,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.012768108601640007
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.01860755213127983,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.01860755213127983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482705,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482705
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.7384132119906791,
"mc2_stderr": 0.01454582251657146
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.01037045555134334
},
"harness|gsm8k|5": {
"acc": 0.5049279757391963,
"acc_stderr": 0.013771815775470578
}
}
```
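As a sketch of how the per-task results above can be aggregated, the snippet below averages the MMLU ("hendrycksTest") accuracies. The `results` dict here is a small hand-copied subset of the JSON shown above, used purely for illustration; with the real file, load it first via `json.load()`.

```python
# Hand-copied subset of the results JSON above (illustration only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# Keep only the MMLU ("hendrycksTest") tasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))
```

Run over the full results file, the same loop reproduces the leaderboard-style MMLU average across all 57 subjects.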
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 02/ゼロツー/zerotwo (Darling in the FranXX)
This is the dataset of 02/ゼロツー/zerotwo (Darling in the FranXX), containing 437 images and their tags.
The core tags of this character are `long_hair, pink_hair, horns, hairband, white_hairband, green_eyes, red_horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 437 | 298.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/02_darlinginthefranxx/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 437 | 298.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/02_darlinginthefranxx/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 858 | 521.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/02_darlinginthefranxx/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/02_darlinginthefranxx',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
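Building on the loop above, here is a minimal sketch of filtering items by tag. It assumes that `item.meta['tags']` maps tag names to confidence scores; plain dicts stand in for the waifuc items so the example is self-contained.

```python
# Stand-in items mimicking waifuc's meta structure (assumption for illustration:
# 'tags' maps tag name -> confidence score).
items = [
    {"filename": "a.png", "tags": {"1girl": 0.99, "solo": 0.95}},
    {"filename": "b.png", "tags": {"1girl": 0.98, "military_uniform": 0.90}},
]

def with_tag(items, tag, threshold=0.5):
    """Return filenames of items whose score for `tag` meets the threshold."""
    return [
        it["filename"]
        for it in items
        if it["tags"].get(tag, 0.0) >= threshold
    ]

print(with_tag(items, "solo"))   # only images tagged 'solo'
print(with_tag(items, "1girl"))  # both images carry '1girl'
```

The same predicate can be applied inside the `for item in source:` loop to select, say, only `solo` images before training.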
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, anime_coloring, aqua_eyes, eyeshadow, solo, portrait, straight_hair, blurry, closed_mouth, looking_at_viewer, parted_lips, smile, uniform |
| 1 | 7 |  |  |  |  |  | 1girl, blood_on_face, pilot_suit, solo, red_bodysuit, upper_body, aqua_eyes, cockpit, looking_at_viewer, closed_mouth, science_fiction |
| 2 | 10 |  |  |  |  |  | 1girl, medium_breasts, pilot_suit, solo, standing, aqua_eyes, red_bodysuit, skin_tight, straight_hair, closed_mouth, covered_navel, looking_at_viewer, science_fiction, eyeshadow, hand_on_own_hip, open_mouth |
| 3 | 11 |  |  |  |  |  | 1girl, military_uniform, orange_necktie, solo, upper_body, eyeshadow, straight_hair, anime_coloring, aqua_eyes, smile, short_necktie, closed_mouth, window |
| 4 | 5 |  |  |  |  |  | 1girl, anime_coloring, military_uniform, orange_necktie, solo, upper_body, smile, looking_at_viewer, open_mouth, short_necktie |
| 5 | 12 |  |  |  |  |  | 1girl, military_uniform, orange_necktie, solo, short_necktie, straight_hair, aqua_eyes, shaded_face, upper_body, closed_mouth |
| 6 | 12 |  |  |  |  |  | 1boy, 1girl, black_hair, military_uniform, orange_necktie, couple, hetero, looking_at_another, smile, makeup |
| 7 | 10 |  |  |  |  |  | 1girl, military_uniform, solo, cherry_blossoms, hat, tree, petals, upper_body, closed_mouth, open_mouth, straight_hair, :d, long_sleeves, makeup, outdoors |
| 8 | 7 |  |  |  |  |  | 1girl, solo, water, partially_submerged, white_one-piece_swimsuit, :d, collarbone, open_mouth, upper_body, breasts, ponytail, sidelocks, anime_coloring |
| 9 | 7 |  |  |  |  |  | 1girl, solo, completely_nude, looking_at_viewer, outdoors, tree, upper_body, hair_censor, hair_over_breasts, smile, closed_mouth, collarbone, forest, medium_breasts, straight_hair |
| 10 | 7 |  |  |  |  |  | 1girl, closed_mouth, sitting, sleeveless_dress, solo, white_dress, indoors, straight_hair, smile, bare_shoulders, barefoot, bed |
| 11 | 20 |  |  |  |  |  | blazer, grey_jacket, pleated_skirt, red_scarf, school_uniform, long_sleeves, white_shirt, striped_necktie, plaid_skirt, miniskirt, open_jacket, black_background, 1girl, 2girls, kneehighs, shoes, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | anime_coloring | aqua_eyes | eyeshadow | solo | portrait | straight_hair | blurry | closed_mouth | looking_at_viewer | parted_lips | smile | uniform | blood_on_face | pilot_suit | red_bodysuit | upper_body | cockpit | science_fiction | medium_breasts | standing | skin_tight | covered_navel | hand_on_own_hip | open_mouth | military_uniform | orange_necktie | short_necktie | window | shaded_face | 1boy | black_hair | couple | hetero | looking_at_another | makeup | cherry_blossoms | hat | tree | petals | :d | long_sleeves | outdoors | water | partially_submerged | white_one-piece_swimsuit | collarbone | breasts | ponytail | sidelocks | completely_nude | hair_censor | hair_over_breasts | forest | sitting | sleeveless_dress | white_dress | indoors | bare_shoulders | barefoot | bed | blazer | grey_jacket | pleated_skirt | red_scarf | school_uniform | white_shirt | striped_necktie | plaid_skirt | miniskirt | open_jacket | black_background | 2girls | kneehighs | shoes | solo_focus |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:------------|:------------|:-------|:-----------|:----------------|:---------|:---------------|:--------------------|:--------------|:--------|:----------|:----------------|:-------------|:---------------|:-------------|:----------|:------------------|:-----------------|:-----------|:-------------|:----------------|:------------------|:-------------|:-------------------|:-----------------|:----------------|:---------|:--------------|:-------|:-------------|:---------|:---------|:---------------------|:---------|:------------------|:------|:-------|:---------|:-----|:---------------|:-----------|:--------|:----------------------|:---------------------------|:-------------|:----------|:-----------|:------------|:------------------|:--------------|:--------------------|:---------|:----------|:-------------------|:--------------|:----------|:-----------------|:-----------|:------|:---------|:--------------|:----------------|:------------|:-----------------|:--------------|:------------------|:--------------|:------------|:--------------|:-------------------|:---------|:------------|:--------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | X | | | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | X | X | | X | | X | X | | | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | X | | X | | X | | | X | | | | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | | X | | | | | X | | X | | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | X | | X | | X | | X | | | | | | | | X | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | | X | | X | | X | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | X | | X | | X | X | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 11 | 20 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
heliosprime/twitter_dataset_1713002097 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8512
num_examples: 19
download_size: 8903
dataset_size: 8512
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713002097"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
asselL/isopodpapers | ---
license: cc
task_categories:
- summarization
language:
- en
- de
pretty_name: Data extraction from isopod papers
--- |
distilled-from-one-sec-cv12/chunk_137 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1171168588
num_examples: 228209
download_size: 1197717186
dataset_size: 1171168588
---
# Dataset Card for "chunk_137"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisson40889/loka | ---
license: openrail
---
|
LRGB/coco_superpixels_edge_wt_region_boundary_10 | ---
task_categories:
- graph-ml
size_categories:
- 1M<n<10M
tags:
- lrgb
license: cc-by-4.0
dataset_info:
features:
- name: x
dtype: int64
- name: edge_index
dtype: int64
- name: edge_attr
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 3625184
num_examples: 113287
- name: val
num_bytes: 160032
num_examples: 5001
- name: test
num_bytes: 160032
num_examples: 5001
download_size: 3252505
dataset_size: 3945248
---
# `coco_superpixels_edge_wt_region_boundary_10`
### Dataset Summary
| Dataset | Domain | Task | Node Feat. (dim) | Edge Feat. (dim) | Perf. Metric |
|---|---|---|---|---|---|
| COCO-SP | Computer Vision | Node Prediction | Pixel + Coord (14) | Edge Weight (1 or 2) | macro F1 |
| Dataset | # Graphs | # Nodes | μ Nodes | μ Deg. | # Edges | μ Edges | μ Short. Path | μ Diameter |
|---|---:|---:|---:|:---:|---:|---:|---:|---:|
| COCO-SP | 123,286 | 58,793,216 | 476.88 | 5.65 | 332,091,902 | 2,693.67 | 10.66±0.55 | 27.39±2.14 |
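The per-graph averages in the table follow directly from the totals. As a quick sanity check (values taken from the table above, nothing else assumed):

```python
# Sanity-check the per-graph averages quoted in the COCO-SP summary table.
num_graphs = 123_286
total_nodes = 58_793_216
total_edges = 332_091_902

mean_nodes = round(total_nodes / num_graphs, 2)
mean_edges = round(total_edges / num_graphs, 2)

print(mean_nodes)  # 476.88, matching "μ Nodes"
print(mean_edges)  # 2693.67, matching "μ Edges"
```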
## Additional Information
### Dataset Curators
* Vijay Prakash Dwivedi ([vijaydwivedi75](https://github.com/vijaydwivedi75))
### Citation Information
```
@article{dwivedi2022LRGB,
title={Long Range Graph Benchmark},
author={Dwivedi, Vijay Prakash and Rampášek, Ladislav and Galkin, Mikhail and Parviz, Ali and Wolf, Guy and Luu, Anh Tuan and Beaini, Dominique},
journal={arXiv:2206.08164},
year={2022}
}
``` |
M9DX/balancedVizData | ---
license: mit
---
|
open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE | ---
pretty_name: Evaluation run of SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE](https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T17:05:23.693649](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE/blob/main/results_2023-12-23T17-05-23.693649.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520506430081576,\n\
\ \"acc_stderr\": 0.031997647808783045,\n \"acc_norm\": 0.653156295534757,\n\
\ \"acc_norm_stderr\": 0.032642046886080175,\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6126404146665745,\n\
\ \"mc2_stderr\": 0.01563487272923927\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145683,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623492\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6833300139414459,\n\
\ \"acc_stderr\": 0.0046422680794889395,\n \"acc_norm\": 0.8603863772156941,\n\
\ \"acc_norm_stderr\": 0.0034587739347195527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645365,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645365\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.016669799592112025,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.016669799592112025\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.012728446067669957,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.012728446067669957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146294,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146294\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6126404146665745,\n\
\ \"mc2_stderr\": 0.01563487272923927\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597221\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \
\ \"acc_stderr\": 0.013113898382146877\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-33-11.430841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-23.693649.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- '**/details_harness|winogrande|5_2023-12-23T16-33-11.430841.parquet'
- split: 2023_12_23T17_05_23.693649
path:
- '**/details_harness|winogrande|5_2023-12-23T17-05-23.693649.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T17-05-23.693649.parquet'
- config_name: results
data_files:
- split: 2023_12_23T16_33_11.430841
path:
- results_2023-12-23T16-33-11.430841.parquet
- split: 2023_12_23T17_05_23.693649
path:
- results_2023-12-23T17-05-23.693649.parquet
- split: latest
path:
- results_2023-12-23T17-05-23.693649.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE](https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE",
"harness_winogrande_5",
split="train")
```
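The timestamped split names shown in the configs above sort chronologically, so the newest run can be picked without parsing dates; a minimal sketch (the split names are copied from this card's configs):

```python
# Split names encode run timestamps as YYYY_MM_DDTHH_MM_SS.ffffff, so
# lexicographic order matches chronological order and max() selects the
# newest run; the "latest" split mirrors that run's files.
splits = ["2023_12_23T16_33_11.430841", "2023_12_23T17_05_23.693649"]
latest = max(splits)
print(latest)  # 2023_12_23T17_05_23.693649
```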
## Latest results
These are the [latest results from run 2023-12-23T17:05:23.693649](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE/blob/main/results_2023-12-23T17-05-23.693649.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6520506430081576,
"acc_stderr": 0.031997647808783045,
"acc_norm": 0.653156295534757,
"acc_norm_stderr": 0.032642046886080175,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6126404146665745,
"mc2_stderr": 0.01563487272923927
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145683,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623492
},
"harness|hellaswag|10": {
"acc": 0.6833300139414459,
"acc_stderr": 0.0046422680794889395,
"acc_norm": 0.8603863772156941,
"acc_norm_stderr": 0.0034587739347195527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112025,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112025
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669957,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146294,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146294
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6126404146665745,
"mc2_stderr": 0.01563487272923927
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597221
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
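Once loaded, the per-task scores above can be aggregated locally, e.g. to recompute an MMLU-style average over the `hendrycksTest` tasks; a minimal sketch (only a small subset of the scores above is reproduced, and the variable names are illustrative):

```python
# Average the "acc" of all hendrycksTest (MMLU) tasks from a results dict
# shaped like the JSON above; only two MMLU tasks are included here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.674074074074074},
    "harness|winogrande|5": {"acc": 0.7955801104972375},
}

mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 3))  # 0.492
```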
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of bigcode/starcoder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode/starcoder](https://huggingface.co/bigcode/starcoder) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 121 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T22:50:56.838467](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoder/blob/main/results_2024-02-14T22-50-56.838467.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2969189890806991,\n\
\ \"acc_stderr\": 0.03236365511067932,\n \"acc_norm\": 0.2979650690177265,\n\
\ \"acc_norm_stderr\": 0.033097159757475146,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707689,\n \"mc2\": 0.4130412207453783,\n\
\ \"mc2_stderr\": 0.014976467041499917\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28071672354948807,\n \"acc_stderr\": 0.013131238126975574,\n\
\ \"acc_norm\": 0.302901023890785,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.37860983867755427,\n\
\ \"acc_stderr\": 0.004840493603166207,\n \"acc_norm\": 0.4787890858394742,\n\
\ \"acc_norm_stderr\": 0.004985289555586536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.02688064788905197,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.02688064788905197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880554,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880554\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215453,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215453\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031076,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031076\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"\
acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3696969696969697,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.3696969696969697,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.02833560973246335,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.02833560973246335\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423088,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423088\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341937,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341937\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.033742355504256936,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.033742355504256936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.01754937638931369,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.01754937638931369\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693268,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693268\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3080168776371308,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4017094017094017,\n\
\ \"acc_stderr\": 0.03211693751051622,\n \"acc_norm\": 0.4017094017094017,\n\
\ \"acc_norm_stderr\": 0.03211693751051622\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3052362707535121,\n\
\ \"acc_stderr\": 0.016467711947635112,\n \"acc_norm\": 0.3052362707535121,\n\
\ \"acc_norm_stderr\": 0.016467711947635112\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.36127167630057805,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.36127167630057805,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.026256053835718968,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.026256053835718968\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n\
\ \"acc_stderr\": 0.026858825879488547,\n \"acc_norm\": 0.33762057877813506,\n\
\ \"acc_norm_stderr\": 0.026858825879488547\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590624,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590624\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2757496740547588,\n\
\ \"acc_stderr\": 0.011413813609161,\n \"acc_norm\": 0.2757496740547588,\n\
\ \"acc_norm_stderr\": 0.011413813609161\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02456220431414232,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02456220431414232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3088235294117647,\n \"acc_stderr\": 0.018690850273595273,\n \
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.018690850273595273\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.34328358208955223,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.34328358208955223,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707689,\n \"mc2\": 0.4130412207453783,\n\
\ \"mc2_stderr\": 0.014976467041499917\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5627466456195738,\n \"acc_stderr\": 0.013941393310695917\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.007950942148339347\n }\n}\n```"
repo_url: https://huggingface.co/bigcode/starcoder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|arc:challenge|25_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|gsm8k|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hellaswag|10_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:53:59.312863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-50-56.838467.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:53:59.312863.parquet'
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T22-50-56.838467.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T22_50_56.838467
path:
- '**/details_harness|winogrande|5_2024-02-14T22-50-56.838467.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T22-50-56.838467.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T21:17:20.453695.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T21:18:29.614335.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:management|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:management|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:virology|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:virology|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T21:18:29.614335.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T21_17_20.453695
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:17:20.453695.parquet'
- split: 2023_08_28T21_18_29.614335
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:18:29.614335.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:18:29.614335.parquet'
- config_name: results
data_files:
- split: 2023_08_28T09_53_59.312863
path:
- results_2023-08-28T09:53:59.312863.parquet
- split: 2023_08_28T21_17_20.453695
path:
- results_2023-08-28T21:17:20.453695.parquet
- split: 2023_08_28T21_18_29.614335
path:
- results_2023-08-28T21:18:29.614335.parquet
- split: 2024_02_14T22_50_56.838467
path:
- results_2024-02-14T22-50-56.838467.parquet
- split: latest
path:
- results_2024-02-14T22-50-56.838467.parquet
---
# Dataset Card for Evaluation run of bigcode/starcoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoder](https://huggingface.co/bigcode/starcoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 121 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoder",
"harness_winogrande_5",
split="train")
```
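Split names follow the run timestamps listed in the metadata above: the `-` and `:` separators of the ISO timestamp are replaced with `_`. A minimal sketch of that mapping (the helper name `timestamp_to_split_name` is illustrative, not part of any library):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-08-28T21:18:29.614335'
    into the corresponding split name '2023_08_28T21_18_29.614335'.

    Assumption based on the split names in this card's metadata:
    only '-' and ':' are replaced with '_'; the 'T' and the
    fractional-seconds '.' are kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(timestamp_to_split_name("2023-08-28T21:18:29.614335"))
# 2023_08_28T21_18_29.614335
```

This can be handy when you want to pin a `load_dataset` call to a specific run rather than the moving "latest" split.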
## Latest results
These are the [latest results from run 2024-02-14T22:50:56.838467](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoder/blob/main/results_2024-02-14T22-50-56.838467.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2969189890806991,
"acc_stderr": 0.03236365511067932,
"acc_norm": 0.2979650690177265,
"acc_norm_stderr": 0.033097159757475146,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707689,
"mc2": 0.4130412207453783,
"mc2_stderr": 0.014976467041499917
},
"harness|arc:challenge|25": {
"acc": 0.28071672354948807,
"acc_stderr": 0.013131238126975574,
"acc_norm": 0.302901023890785,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.37860983867755427,
"acc_stderr": 0.004840493603166207,
"acc_norm": 0.4787890858394742,
"acc_norm_stderr": 0.004985289555586536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.02688064788905197,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.02688064788905197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880554,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880554
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215453,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215453
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031076,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031076
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3696969696969697,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.3696969696969697,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.02833560973246335,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.02833560973246335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423088,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423088
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341937,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341937
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.033742355504256936,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.033742355504256936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.01754937638931369,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.01754937638931369
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4017094017094017,
"acc_stderr": 0.03211693751051622,
"acc_norm": 0.4017094017094017,
"acc_norm_stderr": 0.03211693751051622
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3052362707535121,
"acc_stderr": 0.016467711947635112,
"acc_norm": 0.3052362707535121,
"acc_norm_stderr": 0.016467711947635112
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.36127167630057805,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.36127167630057805,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.026256053835718968,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.026256053835718968
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33762057877813506,
"acc_stderr": 0.026858825879488547,
"acc_norm": 0.33762057877813506,
"acc_norm_stderr": 0.026858825879488547
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590624,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590624
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2757496740547588,
"acc_stderr": 0.011413813609161,
"acc_norm": 0.2757496740547588,
"acc_norm_stderr": 0.011413813609161
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02456220431414232,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02456220431414232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.018690850273595273,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.018690850273595273
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.34328358208955223,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.34328358208955223,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707689,
"mc2": 0.4130412207453783,
"mc2_stderr": 0.014976467041499917
},
"harness|winogrande|5": {
"acc": 0.5627466456195738,
"acc_stderr": 0.013941393310695917
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339347
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
statezeropy/cosmetics_finetune | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 555357
num_examples: 500
download_size: 217720
dataset_size: 555357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
goodcoffee/covidQA_eval | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: answer
dtype: string
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 414807
num_examples: 50
download_size: 50631
dataset_size: 414807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "covidQA_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
agie-ai/awesome-chatgpt-prompts | ---
dataset_info:
features:
- name: act
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 74581
num_examples: 153
download_size: 45077
dataset_size: 74581
---
# Dataset Card for "awesome-chatgpt-prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Q-b1t/Dogs_Emotions_Dataset | ---
license: mit
---
# Dog Emotions Dataset
This is a dataset of images of dogs with happy and sad faces; as simple as that. Use it to train vision classifiers for happy and sad dogs. It comes already split into training and test datasets. Moreover, the labels can be inferred from the file structure:
```
dog_emotions_dataset/
└── images/
├── train/
│ ├── happy/
│ │ ├── ed4QZAil6U779pL3ZndRNLvqxF2gMU890.jpg
│ │ ├── r5J1n5FFdTDAokesz72rKJQRJq3Ktn42.jpg
│ │ ├── efuF5XwayrlqgUVIXtDAkDHKJce4xG629.jpg
│ │ ├── rAawLrHoK1Cjvn2Os5jpM6uIZPNLMe114.jpg
│ │ ├── eghaZlxykdiy5GEaNnmZvdoc39QFXf35.jpg
│ │ └── ...
│ └── sad/
└── test/
├── happy/
└── sad/
``` |
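Because the labels live in the directory names, they can be recovered with a couple of lines of Python. A minimal sketch, assuming the directory layout shown in the tree above:

```python
from pathlib import PurePosixPath

def label_from_path(path: str) -> str:
    # The parent directory of each image ("happy" or "sad") is its label.
    return PurePosixPath(path).parent.name

print(label_from_path("dog_emotions_dataset/images/train/happy/ed4QZAil6U779pL3ZndRNLvqxF2gMU890.jpg"))
# → happy
```

Standard loaders such as torchvision's `ImageFolder`, pointed at `images/train` and `images/test`, will infer the same two classes from the directory names automatically.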
tyzhu/squad_context_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 351990
num_examples: 150
- name: validation
num_bytes: 101044
num_examples: 48
download_size: 101367
dataset_size: 453034
---
# Dataset Card for "squad_context_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_synthetic_superlative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 31406
num_examples: 73
- name: train
num_bytes: 25320
num_examples: 58
download_size: 48342
dataset_size: 56726
---
# Dataset Card for "MULTI_VALUE_rte_synthetic_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/GuidelineQA_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1753067
num_examples: 1999
download_size: 742686
dataset_size: 1753067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chemNLP/chemistry-bookshelves-merged | ---
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 56206230
num_examples: 7728
download_size: 25267751
dataset_size: 56206230
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry-bookshelves-merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1 | ---
pretty_name: Evaluation run of MRAIRR/mini_7B_dare_v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MRAIRR/mini_7B_dare_v1](https://huggingface.co/MRAIRR/mini_7B_dare_v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T22:06:49.514439](https://huggingface.co/datasets/open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1/blob/main/results_2024-02-01T22-06-49.514439.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5971205134841852,\n\
\ \"acc_stderr\": 0.03305252247330764,\n \"acc_norm\": 0.5993597039453373,\n\
\ \"acc_norm_stderr\": 0.03371332021374718,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5464175107671695,\n\
\ \"mc2_stderr\": 0.01554949662717814\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.595399322844055,\n\
\ \"acc_stderr\": 0.004898115110975035,\n \"acc_norm\": 0.7991435968930491,\n\
\ \"acc_norm_stderr\": 0.003998220753048877\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365252,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365252\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377561,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377561\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022895,\n \"\
acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022895\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722724,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722724\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823298,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208181,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208181\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281406,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281406\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144376,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144376\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455328,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455328\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5464175107671695,\n\
\ \"mc2_stderr\": 0.01554949662717814\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.01233483367199829\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5655799848369977,\n \
\ \"acc_stderr\": 0.013653507211411415\n }\n}\n```"
repo_url: https://huggingface.co/MRAIRR/mini_7B_dare_v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-06-49.514439.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- '**/details_harness|winogrande|5_2024-02-01T22-06-49.514439.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T22-06-49.514439.parquet'
- config_name: results
data_files:
- split: 2024_02_01T22_06_49.514439
path:
- results_2024-02-01T22-06-49.514439.parquet
- split: latest
path:
- results_2024-02-01T22-06-49.514439.parquet
---
# Dataset Card for Evaluation run of MRAIRR/mini_7B_dare_v1
Dataset automatically created during the evaluation run of model [MRAIRR/mini_7B_dare_v1](https://huggingface.co/MRAIRR/mini_7B_dare_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1",
"harness_winogrande_5",
	split="latest")
```
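Once the details are loaded, per-task scores can be combined into a simple macro average. The following is a minimal sketch (not the leaderboard's official aggregation) using three accuracy values copied from the "Latest results" JSON below:

```python
# Sketch (illustrative only): macro-average a few per-task accuracies.
# The values are copied from the "Latest results" JSON in this card;
# only three of the 57 MMLU subtasks are included, so this is NOT the
# official aggregate metric, just a demonstration of the computation.
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.36,
    "hendrycksTest-anatomy": 0.5555555555555556,
    "hendrycksTest-astronomy": 0.618421052631579,
}

# Unweighted mean over the selected tasks.
macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(round(macro_avg, 4))
```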
## Latest results
These are the [latest results from run 2024-02-01T22:06:49.514439](https://huggingface.co/datasets/open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1/blob/main/results_2024-02-01T22-06-49.514439.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5971205134841852,
"acc_stderr": 0.03305252247330764,
"acc_norm": 0.5993597039453373,
"acc_norm_stderr": 0.03371332021374718,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5464175107671695,
"mc2_stderr": 0.01554949662717814
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979282
},
"harness|hellaswag|10": {
"acc": 0.595399322844055,
"acc_stderr": 0.004898115110975035,
"acc_norm": 0.7991435968930491,
"acc_norm_stderr": 0.003998220753048877
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365252,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365252
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377561,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377561
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722724,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722724
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602158,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602158
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823298,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208181,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208181
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281406,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281406
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144376,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144376
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455328,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455328
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5464175107671695,
"mc2_stderr": 0.01554949662717814
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.01233483367199829
},
"harness|gsm8k|5": {
"acc": 0.5655799848369977,
"acc_stderr": 0.013653507211411415
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Atipico1/popQA_preprocessed_with_short-original_case | ---
dataset_info:
features:
- name: id
dtype: int64
- name: subj
dtype: string
- name: prop
dtype: string
- name: obj
dtype: string
- name: subj_id
dtype: int64
- name: prop_id
dtype: int64
- name: obj_id
dtype: int64
- name: s_aliases
dtype: string
- name: o_aliases
dtype: string
- name: s_uri
dtype: string
- name: o_uri
dtype: string
- name: s_wiki_title
dtype: string
- name: o_wiki_title
dtype: string
- name: s_pop
dtype: int64
- name: o_pop
dtype: int64
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 103629092
num_examples: 10000
- name: test
num_bytes: 44200817
num_examples: 4267
download_size: 59816758
dataset_size: 147829909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
playGenshinPlayed/ys_role_info_and_audios | ---
license: apache-2.0
---
|
annaissaeva/evenki_all_texts | ---
dataset_info:
features:
- name: env
dtype: string
- name: ru
dtype: string
- name: source
dtype: string
- name: sentence_num
dtype: int64
splits:
- name: train
num_bytes: 653661
num_examples: 2267
download_size: 311403
dataset_size: 653661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
--- |
one-sec-cv12/chunk_234 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21436953120.25
num_examples: 223190
download_size: 19519832702
dataset_size: 21436953120.25
---
# Dataset Card for "chunk_234"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mediocreatmybest/Miscellany_of_Australian_Historical_Photography | ---
license: cc0-1.0
---
|
MultiCoNER/multiconer_v2 | ---
license: cc-by-4.0
task_categories:
- token-classification
language:
- bn
- zh
- de
- en
- es
- fa
- fr
- hi
- it
- pt
- sv
- uk
tags:
- multiconer
- ner
- multilingual
- named entity recognition
- fine-grained ner
size_categories:
- 100K<n<1M
---
# Dataset Card for Multilingual Complex Named Entity Recognition (MultiCoNER)
## Dataset Description
- **Homepage:** https://multiconer.github.io
- **Repository:**
- **Paper:**
- **Leaderboard:** https://multiconer.github.io/results, https://codalab.lisn.upsaclay.fr/competitions/10025
- **Point of Contact:** https://multiconer.github.io/organizers
### Dataset Summary
MultiCoNER uses a fine-grained tagset.
The fine-to-coarse mapping of the tags is as follows:
* Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
* Creative Work (CW) : VisualWork, MusicalWork, WrittenWork, ArtWork, Software
* Group (GRP) : MusicalGRP, PublicCORP, PrivateCORP, AerospaceManufacturer, SportsGRP, CarManufacturer, ORG
* Person (PER) : Scientist, Artist, Athlete, Politician, Cleric, SportsManager, OtherPER
* Product (PROD) : Clothing, Vehicle, Food, Drink, OtherPROD
* Medical (MED) : Medication/Vaccine, MedicalProcedure, AnatomicalStructure, Symptom, Disease
### Supported Tasks and Leaderboards
The final leaderboard of the shared task is available <a href="https://multiconer.github.io/results" target="_blank">here</a>.
### Languages
Supported languages are Bangla, Chinese, English, Spanish, Farsi, French, German, Hindi, Italian, Portuguese, Swedish, and Ukrainian.
## Dataset Structure
The dataset follows CoNLL format.
### Data Instances
Here are some examples in different languages:
* Bangla: [লিটল মিক্স | MusicalGrp] এ যোগদানের আগে তিনি [পিৎজা হাট | ORG] এ ওয়েট্রেস হিসাবে কাজ করেছিলেন।
* Chinese: 它的纤维穿过 [锁骨 | AnatomicalStructure] 并沿颈部侧面倾斜向上和内侧.
* English: [wes anderson | Artist]'s film [the grand budapest hotel | VisualWork] opened the festival .
* Farsi: است] ناگویا |HumanSettlement] مرکزاین استان شهر
* French: l [amiral de coligny | Politician] réussit à s y glisser .
* German: in [frühgeborenes | Disease] führt dies zu [irds | Symptom] .
* Hindi: १७९६ में उन्हें [शाही स्वीडिश विज्ञान अकादमी | Facility] का सदस्य चुना गया।
* Italian: è conservato nel [rijksmuseum | Facility] di [amsterdam | HumanSettlement] .
* Portuguese: também é utilizado para se fazer [licor | Drink] e [vinhos | Drink].
* Spanish: fue superado por el [aon center | Facility] de [los ángeles | HumanSettlement] .
* Swedish: [tom hamilton | Artist] amerikansk musiker basist i [aerosmith | MusicalGRP] .
* Ukrainian: назва альбому походить з роману « [кінець дитинства | WrittenWork] » англійського письменника [артура кларка | Artist] .
### Data Fields
The data has two fields: one is the token and the other is its label. Here is an example from the English data.
```
# id f5458a3a-cd23-4df4-8384-4e23fe33a66b domain=en
doris _ _ B-Artist
day _ _ I-Artist
included _ _ O
in _ _ O
the _ _ O
album _ _ O
billy _ _ B-MusicalWork
rose _ _ I-MusicalWork
's _ _ I-MusicalWork
jumbo _ _ I-MusicalWork
```
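As an illustrative sketch (the column layout and comment handling below are assumptions inferred from the sample above: `#`-prefixed metadata lines, whitespace-separated columns with the token first and the label last, blank lines between sentences), the format can be parsed into (token, label) pairs like this:

```python
def parse_conll(lines):
    """Parse MultiCoNER-style CoNLL lines into lists of (token, label) pairs.

    Assumptions: comment lines start with '#', the token is the first
    column, the label is the last column, and a blank line ends a sentence.
    """
    sentence = []
    for line in lines:
        line = line.strip()
        if not line:                # blank line ends the current sentence
            if sentence:
                yield sentence
                sentence = []
        elif line.startswith("#"):  # metadata/comment line, skip
            continue
        else:
            cols = line.split()
            sentence.append((cols[0], cols[-1]))
    if sentence:                    # flush the last sentence
        yield sentence

sample = [
    "# id f5458a3a-cd23-4df4-8384-4e23fe33a66b domain=en",
    "doris _ _ B-Artist",
    "day _ _ I-Artist",
    "included _ _ O",
]
sents = list(parse_conll(sample))
# sents[0] → [("doris", "B-Artist"), ("day", "I-Artist"), ("included", "O")]
```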
### Data Splits
Train, dev, and test splits are provided.
## Dataset Creation
TBD
## Loading the Dataset
```python
from datasets import load_dataset
english_data = load_dataset('MultiCoNER/multiconer_v2', 'English (EN)')
```
### Licensing Information
CC BY 4.0
### Citation Information
```
@inproceedings{multiconer2-report,
title={{SemEval-2023 Task 2: Fine-grained Multilingual Named Entity Recognition (MultiCoNER 2)}},
author={Fetahu, Besnik and Kar, Sudipta and Chen, Zhiyu and Rokhlenko, Oleg and Malmasi, Shervin},
booktitle={Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)},
year={2023},
publisher={Association for Computational Linguistics},
}
@article{multiconer2-data,
title={{MultiCoNER v2: a Large Multilingual dataset for Fine-grained and Noisy Named Entity Recognition}},
author={Fetahu, Besnik and Chen, Zhiyu and Kar, Sudipta and Rokhlenko, Oleg and Malmasi, Shervin},
year={2023},
}
```
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-5000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 989990
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
senhorsapo/nezuko | ---
license: openrail
---
|
open-llm-leaderboard/details_Azure99__blossom-v3-mistral-7b | ---
pretty_name: Evaluation run of Azure99/blossom-v3-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v3-mistral-7b](https://huggingface.co/Azure99/blossom-v3-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v3-mistral-7b\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T12:57:13.954407](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v3-mistral-7b/blob/main/results_2023-12-02T12-57-13.954407.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4670204700530705,\n\
\ \"acc_stderr\": 0.013742492794163416\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.4670204700530705,\n \"acc_stderr\": 0.013742492794163416\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v3-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T12_57_13.954407
path:
- '**/details_harness|gsm8k|5_2023-12-02T12-57-13.954407.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T12-57-13.954407.parquet'
- config_name: results
data_files:
- split: 2023_12_02T12_57_13.954407
path:
- results_2023-12-02T12-57-13.954407.parquet
- split: latest
path:
- results_2023-12-02T12-57-13.954407.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v3-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Azure99/blossom-v3-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Azure99/blossom-v3-mistral-7b](https://huggingface.co/Azure99/blossom-v3-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v3-mistral-7b",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T12:57:13.954407](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v3-mistral-7b/blob/main/results_2023-12-02T12-57-13.954407.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4670204700530705,
"acc_stderr": 0.013742492794163416
},
"harness|gsm8k|5": {
"acc": 0.4670204700530705,
"acc_stderr": 0.013742492794163416
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rcds/MultiLegalNeg | ---
license: cc-by-nd-4.0
viewer: true
task_categories:
- token-classification
tags:
- legal
pretty_name: Multilingual Negation Scope Resolution
size_categories:
- 1K<n<10K
---
# Dataset Card for MultiLegalNeg
### Dataset Summary
This dataset consists of German, French, and Italian court documents annotated for negation cues and negation scopes. It also includes reformatted versions of ConanDoyle-neg ([Morante and Blanco, 2012](https://aclanthology.org/S12-1035/)), SFU Review ([Konstantinova et al., 2012](http://www.lrec-conf.org/proceedings/lrec2012/pdf/533_Paper.pdf)), BioScope ([Szarvas et al., 2008](https://aclanthology.org/W08-0606/)), and Dalloux ([Dalloux et al., 2020](https://clementdalloux.fr/?page_id=28)).
### Languages
| Language | Subset | Number of sentences | Negated sentences |
|----------------------|-----------------|----------------------|-------------------|
| French | **fr** | 1059 | 382 |
| Italian | **it** | 1001 | 418 |
| German (Germany) | **de(DE)** | 1068 | 1098 |
| German (Switzerland) | **de(CH)** | 206 | 208 |
| English | **SFU Review** | 17672 | 3528 |
| English | **BioScope** | 14700 | 2095 |
| English | **ConanDoyle-neg**| 5714 | 5714 |
| French | **Dalloux** | 11032 | 1817 |
## Dataset Structure
### Data Fields
- text (string): full sentence
- spans (list): list of annotated cues and scopes
- start (int): offset of the beginning of the annotation
- end (int): offset of the end of the annotation
- token_start(int): id of the first token in the annotation
- token_end(int): id of the last token in the annotation
- label (string): CUE or SCOPE
- tokens (list): list of tokens in the sentence
- text (string): token text
- start (int): offset of the first character
- end (int): offset of the last character
- id (int): token id
- ws (boolean): indicates if the token is followed by a white space
### Data Splits
For each subset, train (70%), test (20%), and validation (10%) splits are available.
#### How to use this dataset
To load all data, use `'all_all'`, or specify which subset to load as the second argument. The available configurations are
`'de'`, `'fr'`, `'it'`, `'swiss'`, `'fr_dalloux'`, `'fr_all'`, `'en_bioscope'`, `'en_sherlock'`, `'en_sfu'`, `'en_all'`, and `'all_all'`.
```
from datasets import load_dataset
dataset = load_dataset("rcds/MultiLegalNeg", "all_all")
dataset
```
```
DatasetDict({
train: Dataset({
features: ['text', 'spans', 'tokens'],
num_rows: 26440
})
test: Dataset({
features: ['text', 'spans', 'tokens'],
num_rows: 7593
})
validation: Dataset({
features: ['text', 'spans', 'tokens'],
num_rows: 4053
})
})
```
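Using the `text` and `spans` fields documented above, cue and scope strings can be recovered from a record via the character offsets. A minimal sketch (the example record below is fabricated for illustration and is not taken from the dataset):

```python
def extract_spans(record):
    """Return (label, substring) pairs from a MultiLegalNeg-style record,
    slicing `text` with the documented `start`/`end` character offsets."""
    return [
        (span["label"], record["text"][span["start"]:span["end"]])
        for span in record["spans"]
    ]

# Fabricated record mirroring the documented schema:
record = {
    "text": "The court did not accept the appeal.",
    "spans": [
        {"start": 14, "end": 17, "label": "CUE"},
        {"start": 18, "end": 35, "label": "SCOPE"},
    ],
}
spans = extract_spans(record)
# spans → [("CUE", "not"), ("SCOPE", "accept the appeal")]
```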
### Source Data
| Subset | Source |
|-------------------|----------------------|
| **fr** | [Niklaus et al. 2021](https://aclanthology.org/2021.nllp-1.3/), [Niklaus et al. 2023](https://arxiv.org/abs/2306.02069) |
| **it** | [Niklaus et al. 2021](https://aclanthology.org/2021.nllp-1.3/), [Niklaus et al. 2023](https://arxiv.org/abs/2306.02069) |
| **de(DE)** | [Glaser et al. 2021](https://www.scitepress.org/Link.aspx?doi=10.5220/0010246308120821) |
| **de(CH)** | [Niklaus et al. 2021](https://aclanthology.org/2021.nllp-1.3/) |
| **SFU Review** | [Konstantinova et al. 2012](http://www.lrec-conf.org/proceedings/lrec2012/pdf/533_Paper.pdf) |
| **BioScope** | [Szarvas et al. 2008](https://aclanthology.org/W08-0606/) |
| **ConanDoyle-neg**| [Morante and Blanco. 2012](https://aclanthology.org/S12-1035/) |
| **Dalloux** | [Dalloux et al. 2020](https://clementdalloux.fr/?page_id=28) |
### Annotations
The data is annotated for negation cues and their scopes. Annotation guidelines are available [here](https://github.com/RamonaChristen/Multilingual_Negation_Scope_Resolution_on_Legal_Data/blob/main/Annotation_Guidelines.pdf).
#### Annotation process
Each language was annotated by one native-speaking annotator following strict annotation guidelines.
### Citation Information
Please cite the following preprint:
```
@misc{christen2023resolving,
title={Resolving Legalese: A Multilingual Exploration of Negation Scope Resolution in Legal Documents},
author={Ramona Christen and Anastassia Shaitarova and Matthias Stürmer and Joel Niklaus},
year={2023},
eprint={2309.08695},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Mxode/StackOverflow-QA-C-Language-40k | ---
license: apache-2.0
language:
- en
tags:
- code
task_categories:
- question-answering
size_categories:
- 10K<n<100K
---
This is a collection of ~40k Q&A pairs about the **C language** from Stack Overflow. The data has been initially cleaned, and each response is the question's **accepted answer**.
All entries are **<1000** characters in length.
The questions and answers are organized into a **one-line** JSON format. A sample is shown below:
```json
{
"question": "```\nFILE* file = fopen(some file)\n\npcap_t* pd = pcap_fopen_offline(file)\n\npcap_close(pd)\n\nfclose(file)\n```\n\nThis code occurs double free error.\n\nCould you explain about this happening?\n\nMy Guess is that pd and file pointers are sharing some datas.\n",
"answer": "As the documentation says, thepcap_closefunction closes the files associated with thepcap_tstructure passed to it. Closing the file again withfcloseis an error.\n"
}
``` |
ttagu99/ko_f_1871 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 6462525
num_examples: 1871
download_size: 3201114
dataset_size: 6462525
---
# Dataset Card for "ko_f_1871"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amay01/llm-sgd-dst8-train-test-data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 61828454
num_examples: 175780
- name: test
num_bytes: 61828454
num_examples: 175780
download_size: 0
dataset_size: 123656908
---
# Dataset Card for "llm-sgd-dst8-train-test-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bkai-foundation-models/BKAINewsCorpus | ---
dataset_info:
features:
- name: id
dtype: int64
- name: link
dtype: string
- name: publish
struct:
- name: $date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 56444149767
num_examples: 16762024
download_size: 28652191009
dataset_size: 56444149767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "BKAINewsCorpus"
The [Binhvq News Corpus](https://github.com/binhvq/news-corpus), a widely used dataset featuring approximately 20 million articles from diverse sources, received its last update in May 2021. To enhance this collection, we gathered an additional 10 million articles up until November 2023. By integrating these newly acquired articles with the existing [Binhvq News Corpus](https://github.com/binhvq/news-corpus), we have created an extensive Vietnamese News Corpus comprising about 32M articles. Subsequent fuzzy deduplication was conducted to remove duplicate articles, resulting in 53 GB of clean data, which is ready for the continual pretraining of LLMs.
### Please cite our manuscript if this dataset is used for your work
```
@article{duc2024towards,
title={Towards Comprehensive Vietnamese Retrieval-Augmented Generation and Large Language Models},
author={Nguyen Quang Duc, Le Hai Son, Nguyen Duc Nhan, Nguyen Dich Nhat Minh, Le Thanh Huong, Dinh Viet Sang},
journal={arXiv preprint arXiv:2403.01616},
year={2024}
}
``` |
khwrali011/En_Ger_translation | ---
dataset_info:
features:
- name: English
dtype: string
- name: German
dtype: string
splits:
- name: train
num_bytes: 13690179
num_examples: 177226
- name: test
num_bytes: 3420351
num_examples: 44307
download_size: 11787004
dataset_size: 17110530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2 | ---
pretty_name: Evaluation run of hongzoh/Yi-6B_Open-Platypus-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hongzoh/Yi-6B_Open-Platypus-v2](https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:35:06.410961](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2/blob/main/results_2024-03-29T20-35-06.410961.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5682981374820366,\n\
\ \"acc_stderr\": 0.03331728566573774,\n \"acc_norm\": 0.5770899804690629,\n\
\ \"acc_norm_stderr\": 0.03405502121628935,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.42338274205343635,\n\
\ \"mc2_stderr\": 0.014268690462127283\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4453924914675768,\n \"acc_stderr\": 0.014523987638344085,\n\
\ \"acc_norm\": 0.4991467576791809,\n \"acc_norm_stderr\": 0.014611369529813279\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5151364270065724,\n\
\ \"acc_stderr\": 0.004987494455523726,\n \"acc_norm\": 0.72176857199761,\n\
\ \"acc_norm_stderr\": 0.004472121485161911\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.0258221061194159,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.0258221061194159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419871,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419871\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.032702871814820816,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.032702871814820816\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489298,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489298\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584183,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271143,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493274,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493274\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547724,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547724\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328913,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.42338274205343635,\n\
\ \"mc2_stderr\": 0.014268690462127283\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \
\ \"acc_stderr\": 0.010058474790238955\n }\n}\n```"
repo_url: https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|winogrande|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-35-06.410961.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- results_2024-03-29T20-35-06.410961.parquet
- split: latest
path:
- results_2024-03-29T20-35-06.410961.parquet
---
# Dataset Card for Evaluation run of hongzoh/Yi-6B_Open-Platypus-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hongzoh/Yi-6B_Open-Platypus-v2](https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T20:35:06.410961](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2/blob/main/results_2024-03-29T20-35-06.410961.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5682981374820366,
"acc_stderr": 0.03331728566573774,
"acc_norm": 0.5770899804690629,
"acc_norm_stderr": 0.03405502121628935,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.42338274205343635,
"mc2_stderr": 0.014268690462127283
},
"harness|arc:challenge|25": {
"acc": 0.4453924914675768,
"acc_stderr": 0.014523987638344085,
"acc_norm": 0.4991467576791809,
"acc_norm_stderr": 0.014611369529813279
},
"harness|hellaswag|10": {
"acc": 0.5151364270065724,
"acc_stderr": 0.004987494455523726,
"acc_norm": 0.72176857199761,
"acc_norm_stderr": 0.004472121485161911
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.0258221061194159,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.0258221061194159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419871,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419871
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.032702871814820816,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.032702871814820816
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489298,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489298
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584183,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271143,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493274,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547724,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547724
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328913,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.42338274205343635,
"mc2_stderr": 0.014268690462127283
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798499
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238955
}
}
```
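The aggregated `"all"` block above is the mean of the per-task scores. A minimal sketch of that aggregation, using only a small illustrative subset of the tasks shown (so the resulting mean differs from the full `"all"` value):

```python
import json

# A small illustrative subset of the per-task results above.
results = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5407407407407407},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.5657894736842105}
}
""")

# Average the "acc" metric over tasks, mirroring how the "all" block aggregates scores.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # mean accuracy over this 3-task subset
```

The real `"all"` figures also fold in the non-MMLU tasks (ARC, HellaSwag, etc.), so recomputing them requires every per-task entry, not just this subset.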
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TibetanAI/TibetanAI_NERv1.0 | ---
license: apache-2.0
language:
- bo
---
# Dataset Card for TibetanAI_NERv1.0
## Dataset Description
TibetanAI_NERv1.0 is a Tibetan named entity recognition (NER) dataset.
- **Paper:** 基于小样本学习的藏文命名实体识别 (Tibetan Named Entity Recognition Based on Few-Shot Learning)
### Languages
Tibetan
### Licensing Information
apache-2.0
### Citation Information
于韬, 张英, 拥措. 基于小样本学习的藏文命名实体识别 (Tibetan Named Entity Recognition Based on Few-Shot Learning) [J]. 计算机与现代化 (Computer and Modernization), 2023(05): 13-19.
### Contributions
Title: 基于小样本学习的藏文命名实体识别 (Tibetan Named Entity Recognition Based on Few-Shot Learning)
Authors: 于韬; 张英; 拥措
Affiliations: School of Information Science and Technology, Tibet University; Tibet Autonomous Region Key Laboratory of Tibetan Information Technology and Artificial Intelligence, Tibet University; Engineering Research Center of Tibetan Information Technology, Ministry of Education, Tibet University
|
autoevaluate/autoeval-staging-eval-project-samsum-f90fd7b5-10915466 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/led-large-book-summary
metrics: ['bleu']
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-large-book-summary
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
thercyl/XOM | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 4309445
num_examples: 131
download_size: 2623140
dataset_size: 4309445
---
# Dataset Card for "XOM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adalib/full-cond-gen | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 34788783.0
num_examples: 6466
download_size: 12767855
dataset_size: 34788783.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
luigisaetta/atco2_only_augmented | ---
license: mit
task_categories:
- automatic-speech-recognition
language:
- en
tags:
- atc
- asr
- en
pretty_name: atco2 augmented
size_categories:
- 1K<n<10K
--- |
pinecone/movielens-recent-ratings | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license: []
multilinguality:
- monolingual
pretty_name: MovieLens User Ratings
size_categories:
- 100K<n<1M
source_datasets: []
tags:
- movielens
- recommendation
- collaborative filtering
task_categories: []
task_ids: []
---
# MovieLens User Ratings
This dataset contains ~1M user ratings covering ~10k of the most recent movies from the MovieLens 25M dataset, rated by over 30k unique users. The dataset is built by streaming the MovieLens 25M dataset, filtering for the recent movies, and keeping the user ratings for those; after a few joins and checks, we get this dataset. The URLs of the respective movie posters are included.
The dataset is part of an example on [building a movie recommendation engine](https://www.pinecone.io/docs/examples/movie-recommender-system/) with vector search. |
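The filter-and-join step described above can be sketched in plain Python. The records below are invented toy data; the field names loosely mirror the MovieLens schema but are illustrative only:

```python
# Toy stand-ins for the MovieLens movies and ratings tables.
movies = [
    {"movie_id": 1, "title": "Old Classic", "year": 1950, "poster_url": "https://example.com/1.jpg"},
    {"movie_id": 2, "title": "Recent Hit", "year": 2019, "poster_url": "https://example.com/2.jpg"},
]
ratings = [
    {"user_id": 10, "movie_id": 1, "rating": 4.0},
    {"user_id": 10, "movie_id": 2, "rating": 5.0},
    {"user_id": 11, "movie_id": 2, "rating": 3.5},
]

# Keep only recent movies, then join each surviving rating with its movie metadata.
recent = {m["movie_id"]: m for m in movies if m["year"] >= 2000}
joined = [
    {**r, "title": recent[r["movie_id"]]["title"], "poster_url": recent[r["movie_id"]]["poster_url"]}
    for r in ratings
    if r["movie_id"] in recent
]
print(len(joined))  # ratings that survive the recency filter
```

The real pipeline does the same thing at scale with the streaming API of the `datasets` library rather than in-memory lists.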
IIC/RagQuAS | ---
language:
- es
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
task_categories:
- question-answering
- text-retrieval
task_ids:
- document-retrieval
- extractive-qa
pretty_name: RAGMiscContextual
tags:
- spanish
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: topic
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: variant
dtype: string
- name: context_1
dtype: string
- name: context_2
dtype: string
- name: context_3
dtype: string
- name: context_4
dtype: string
- name: context_5
dtype: string
- name: link_1
dtype: string
- name: link_2
dtype: string
- name: link_3
dtype: string
- name: link_4
dtype: string
- name: link_5
dtype: string
- name: text_1
dtype: string
- name: text_2
dtype: string
- name: text_3
dtype: string
- name: text_4
dtype: string
- name: text_5
dtype: string
splits:
- name: train
num_bytes: 6905998
num_examples: 201
download_size: 1015578
dataset_size: 6905998
---
# Retrieval-Augmented Generation and Question-Answering in Spanish (RagQuAS) Dataset
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Leaderboard:** [Leaderboard Somos600M]()
- **Point of Contact:** [Instituto de Ingeniería del Conocimiento](contacto.iic@iic.uam.es)
### Dataset Summary
RagQuAS is a high-quality dataset with examples across a large number of domains: hobbies, linguistics, pets, health, astronomy, customer service, cars, daily life, documentation, energy, skiing, scams, gastronomy, languages, games, language, manicure, music, skating, first aid, recipes, recycling, complaints, insurance, tennis, transport, tourism, veterinary medicine, travel, and yoga.
### Supported Tasks and Leaderboards
It is designed to evaluate a complete RAG (Retrieval-Augmented Generation) system end-to-end.
### Languages
Spanish (BCP-47 es).
## Dataset Structure
### Data Instances
Instances in this dataset have the following structure:
| topic | answer | question | variant | context_1 | context_2 | context_3 | context_4 | context_5 | link_1 | link_2 | link_3 | link_4 | link_5 | text_1 | text_3 | text_4 | text_5 |
|:--------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------|:-----------|:---------------------------|:-------------------------------------|:---------------------------------------------------|:------------|:------------|:-----------------------------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------------|:---------|:---------|:----------------------------------|:-------------------------------------------|:---------|:---------|
| reclamaciones | La opción más fácil y eficaz para reclamar una indemnización por retraso de vuelo en Europa es... | ¿Cuál es la forma más fácil de reclamar cuando un vuelo que sale de España se ha retrasado? | question_1 | #1. Airhelp. La empresa... | En AirHelp hemos ayudado a más de... | MYFLYRIGHT, expertos en derechos de los viajero... | | | https://www.businessinsider.es/mejores-paginas-reclamar-vuelo-cancelado-retrasado-804901 | https://www.airhelp.com/es/retrasos-de-vuelos/ | https://myflyright.com/es/servicios/vuelo-retrasado/ | | | 5 páginas donde poder reclamar... | Indemnización retraso vuelo. Navegación... | | |
### Data Fields
- **topic:** the domain the example belongs to.
- **question:** a question about the documents.
- **variant:** an indicator of the question variant. When two "answer" values are identical, the two rows in the corpus represent the same query, phrased in a different way.
- **answer:** the system's answer to any of the variants.
- **context_i:** the context from document i that was used to answer the question in any of the variants.
- **text_i:** the full text of document i.
- **link_i:** the link to document i.
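As an illustration of how the `variant` field works (using made-up rows, not real dataset entries), the variants of a single query can be recovered by grouping rows on their shared `answer`:

```python
from collections import defaultdict

# Made-up rows mimicking the schema above: rows that share the same
# "answer" are variants (rephrasings) of the same underlying query.
rows = [
    {"variant": "question_1", "question": "¿Cómo reclamo un vuelo retrasado?", "answer": "Usa AirHelp..."},
    {"variant": "question_2", "question": "¿Cuál es la forma más fácil de reclamar un retraso?", "answer": "Usa AirHelp..."},
    {"variant": "question_1", "question": "¿Qué postura de yoga ayuda a la espalda?", "answer": "La postura del gato..."},
]

queries = defaultdict(list)
for row in rows:
    queries[row["answer"]].append(row["question"])

# Two distinct queries; the first one has two phrasings.
print([len(v) for v in queries.values()])  # [2, 1]
```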
### Data Splits
The dataset is not split into train, validation, and test sets because it is designed for evaluation.
| | train |
|-------------------------|------:|
| Input Sentences | 201 |
## Dataset Creation
### Curation Rationale
RAG systems are complex pipelines that involve the collaboration of several AI models. Datasets that evaluate such systems as a whole are very valuable for measuring their end-to-end effectiveness.
### Source Data
The data was created from plain text extracted from the web, with information from the various domains.
#### Initial Data Collection and Normalization
For data collection, texts were selected from the chosen domains; a series of questions, with different variants, was then designed for them, and the contexts containing the information relevant to answering each question were selected.
#### Who are the source language producers?
The entire corpus was generated and reviewed by humans.
### Annotations
The annotation guidelines consisted of generating question-answer pairs for a given document and finding the relevant information within the documents to obtain the contexts.
#### Annotation process
The corpus methodology consisted of agreeing on and designing the questions to ask about the data, and of resolving any doubts.
#### Who are the annotators?
The corpus was produced manually by two computational linguists. The answers were written by each annotator.
### Personal and Sensitive Information
The dataset is free of personal and sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
Creating high-quality corpora in Spanish is vitally important if we want AI in this language to match the level of English. Contributing high-quality corpora with varied tasks and domains is the most significant step towards this goal.
### Discussion of Biases
No bias analysis has been carried out, so some biases may be present owing to the sources from which the selected contexts were taken.
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
[Instituto de Ingeniería del Conocimiento](https://www.iic.uam.es/iic/) (IIC).
### Licensing Information
This dataset is released under the non-commercial [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
### Citation Information
```
@misc {Instituto de Ingeniería del Conocimiento (IIC),
author = { {Instituto de Ingeniería del Conocimiento} },
title = { Retrieval-Augmented-Generation and Question-Answering in Spanish (RagQuAS) Dataset },
year = 2024,
url = { https://huggingface.co/datasets/IIC/RagQuAS },
doi = { 10.57967/hf/2044 },
publisher = { Hugging Face }
}
```
### Contributions
Thanks to [@mariagrandury](https://huggingface.co/mariagrandury) for giving us the opportunity to take part in creating an instruction corpus in Spanish and the co-official languages, to strengthen AI models in these rich, varied, and highly relevant languages. |
CyberHarem/k31_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of k31/K31/K31 (Girls' Frontline)
This is the dataset of k31/K31/K31 (Girls' Frontline), containing 18 images and their tags.
The core tags of this character are `hair_ornament, pink_hair, long_hair, purple_eyes, headphones, breasts, bangs, hair_between_eyes, hair_intakes, x_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 21.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 10.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 23.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 18.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 37.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/k31_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, solo, cleavage, holding, smile, looking_at_viewer, simple_background, white_background, blush, black_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | holding | smile | looking_at_viewer | simple_background | white_background | blush | black_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:----------|:--------|:--------------------|:--------------------|:-------------------|:--------|:---------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
sfcompute/TinyNarrations | ---
viewer: false
license:
- other
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype: Audio
config_name: default
splits:
- name: train
num_bytes: 783536881667
num_examples: 89112
- name: validation
num_bytes: 16526026753
num_examples: 864
download_size: 800062908420
---
[Blog](https://sfcompute.com/blog/tiny-narrations) | [GitHub](https://github.com/sfcompute/tinynarrations)

```bash
pip install datasets
```
```python
from datasets import load_dataset
val_split = load_dataset('sfcompute/TinyNarrations', split='validation', streaming=True)
train_split = load_dataset('sfcompute/TinyNarrations', split='train', streaming=True)
```
```python
import torch
wav = torch.from_numpy(next(iter(val_split))['audio']['array']).unsqueeze(0)
```
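As a rough, torch-free sketch of what that tensor looks like — the 16 kHz rate here is an assumption; read the actual rate from the sample's `audio['sampling_rate']` field:

```python
import numpy as np

# Stand-in for one sample's audio array; the real one comes from
# next(iter(val_split))['audio']['array'].
sample_rate = 16_000  # assumed; check audio['sampling_rate'] on a real sample
array = np.zeros(2 * sample_rate, dtype=np.float32)  # a fake 2-second mono clip

wav = array[np.newaxis, :]  # shape (1, num_samples), like unsqueeze(0) above
duration_s = wav.shape[-1] / sample_rate
print(wav.shape, duration_s)  # (1, 32000) 2.0
```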
To load audio, ensure you have the following installed:
```bash
pip install librosa soundfile
``` |
qgallouedec/prj_gia_dataset_metaworld_stick_pull_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the stick-pull-v2 environment, sampled from the stick-pull-v2 policy.
This dataset was created as part of the GIA (Generally Intelligent Agents) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_stick_pull_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_stick_pull_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
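The exact array layout is not documented here, but assuming flat per-transition arrays where `dones` marks episode ends (an assumption, not a documented guarantee), individual episodes can be recovered like this:

```python
import numpy as np

# Made-up transitions standing in for the real dataset dict.
dataset = {
    "rewards": np.array([0.1, 0.2, 0.3, 0.1, 0.5]),
    "dones":   np.array([False, False, True, False, True]),
}

# Indices one past each episode end, used as split points.
ends = np.flatnonzero(dataset["dones"]) + 1          # -> [3, 5]
episodes = np.split(dataset["rewards"], ends[:-1])   # np.split keeps the tail chunk
print([len(ep) for ep in episodes])                  # [3, 2]
```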
|
yzhuang/autotree_automl_covertype_gosdt_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: int64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5541200000
num_examples: 100000
- name: validation
num_bytes: 554120000
num_examples: 10000
download_size: 959372939
dataset_size: 6095320000
---
# Dataset Card for "autotree_automl_covertype_gosdt_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mesolitica/unsupervised-malay-youtube-speaker-diarization | ---
language:
- ms
---
# Unsupervised Malay speakers from YouTube videos
10,492 unique speakers with at least 75 hours of voice activity. Steps to reproduce: https://github.com/huseinzol05/malaya-speech/blob/master/data/youtube/process-youtube.ipynb
## how-to
1. Download and extract [processed-youtube.tar.gz](processed-youtube.tar.gz); each processed video is saved as a pickle file, `{video_name}.pkl`.
2. Each pickle file contains entries like,
```python
[{'wav_data': '/home/husein/ssd2/processed-youtube-v2/"Abam_peluk_saya_lama_atas_pentas_akhir_MLM"-_Ali_Puteh_menangis_imbau_saat_manis_dengan_arwah_abang-_MdgGr7VD7w/0.mp3',
'timestamp': datetime.datetime(2023, 3, 2, 18, 45, 45, 778042),
'asr_model': ('kenapa tak mahu bangun kau abang',
[0.5325799628135358],
[309, 9, 399, 633, 108, 252]),
'classification_model': (array([ 3.02432757e-03, -3.64390127e-02, 2.93319039e-02, -2.84599233e-02,
-5.04244901e-02, 6.03185333e-02, 7.04260264e-03, 7.36895157e-03,
2.41034012e-02, -3.31214964e-02, -1.61228217e-02, -1.92081463e-02,
-1.77928973e-02, 1.05488757e-02, 5.11314301e-03, 2.08497643e-02,
2.80407351e-02, -1.34683009e-02, 1.10213496e-02, -5.76948654e-03,
2.11171638e-02, -3.10498872e-03, 1.60899870e-02, -2.22061612e-02,
-3.09270490e-02, 1.03673469e-02, 2.29822248e-02, 5.44358939e-02,
-9.44061391e-03, 3.24469656e-02, -1.40673192e-02, 6.55731931e-03,
1.94134321e-02, 2.31755860e-02, -8.62774719e-03, -3.72681394e-03,
-3.17485556e-02, -1.12474747e-02, 1.65595114e-02, 2.31244415e-02,
3.28784771e-02, 8.52510054e-03, -6.41896739e-04, 3.13562714e-03,
-3.15982029e-02, 1.72785181e-03, 1.58039071e-02, -9.93900001e-03,
2.03248486e-02, -2.98949536e-02, 3.53759155e-02, 3.06809470e-02,
-3.68881435e-03, -3.98267582e-02, -2.07101982e-02, 2.51877047e-02,
-2.51530181e-03, 1.06034977e-02, 1.24978041e-02, 2.35916697e-03,
1.31300613e-02, -1.62451845e-02, -2.09861826e-02, 3.17490734e-02,
-1.18532358e-02, 4.25735563e-02, 4.17908467e-02, 1.21251179e-03,
-3.85571155e-03, -9.50544327e-03, -7.37808086e-03, 2.63940021e-02,
1.09219365e-02, 3.05683501e-02, -4.08848785e-02, -1.71920974e-02,
-1.46033484e-02, -3.29053291e-05, 3.84788848e-02, -7.86552951e-03,
1.01251132e-03, 2.72140447e-02, 2.52339337e-02, 3.39004360e-02,
-1.38184745e-02, 2.60320995e-02, -1.01425601e-02, -1.16012329e-02,
4.30319924e-03, -1.01203052e-02, -4.66396799e-03, -2.64480542e-02,
3.44322808e-02, -4.64622118e-03, 1.06053520e-02, 1.37923108e-02,
-2.05409434e-03, -1.19995829e-02, 2.10450366e-02, -2.87155900e-03,
-1.39515549e-02, -1.51185887e-02, 2.29053162e-02, -1.78178120e-02,
1.95855577e-03, 2.37271357e-02, 2.80657201e-03, -6.08753460e-03,
-2.01220363e-02, 3.22612897e-02, 1.82474777e-02, 5.31493872e-02,
-7.08705634e-02, 2.76431069e-03, 1.03597697e-02, -3.53837833e-02,
1.38167264e-02, -5.91275143e-03, 1.84398554e-02, 6.05177172e-02,
1.14565976e-02, 1.56977493e-02, -1.82731878e-02, -4.58574407e-02,
-1.08330613e-02, -1.16500622e-02, -1.19803764e-04, 6.48374185e-02,
-1.21538760e-03, -5.41793741e-02, 1.38867721e-02, 3.52845751e-02,
-2.08288375e-02, 1.03750750e-02, -2.17110049e-02, 2.29265504e-02,
-1.21381739e-02, -1.47071329e-03, -4.36875001e-02, -2.25690063e-02,
-4.16939743e-02, -8.39853752e-03, -2.06098761e-02, 2.30504461e-02,
3.48615423e-02, -4.18495797e-02, -2.41985917e-03, -3.18994140e-03,
1.22078639e-02, -9.50168632e-03, -1.97298196e-03, 1.30731370e-02,
2.07234323e-02, 1.08521534e-02, 2.30542179e-02, -2.54045837e-02,
1.45645533e-02, -1.08493539e-02, -1.30415503e-02, 3.29123251e-02,
3.46204527e-02, 2.58748885e-04, -1.28235819e-03, -1.32823242e-02,
5.47284493e-03, -2.62062326e-02, 2.31803600e-02, -2.04505119e-02,
2.32407395e-02, 2.12946888e-02, -1.28869051e-02, -6.81399694e-03,
5.68802692e-02, 4.31004271e-04, 1.67261921e-02, 2.93559525e-02,
1.32581135e-02, -9.03073605e-03, -9.38207190e-03, 1.74718127e-02,
1.72506981e-02, 5.02267219e-02, -1.32851647e-02, 5.07321544e-02,
-1.87530685e-02, 4.18599546e-02, 1.50075918e-02, -2.61102356e-02,
-1.59594957e-02, 1.36823149e-03, -9.64679196e-03, 1.71130225e-02],
dtype=float32),
'speaker 0')}]
```
- All mp3 files are post-processed using https://malaya-speech.readthedocs.io/en/latest/load-noise-reduction.html and https://malaya-speech.readthedocs.io/en/latest/load-speech-enhancement.html
- `wav_data` is the path to the audio file; adjust the prefix to point at your extracted directory.
- `asr_model` is predicted using the best model we have, `conformer-medium`, and returns `(text, probability, subwords)`, https://malaya-speech.readthedocs.io/en/latest/load-stt-transducer-model-pt.html
- `classification_model` is predicted using the NVIDIA NeMo TitaNet Large speaker verification model, https://catalog.ngc.nvidia.com/orgs/nvidia/teams/nemo/models/titanet_large, with streaming speaker similarity, https://malaya-speech.readthedocs.io/en/latest/huggingface-repository.html
3. Similar speakers are grouped using a PageRank-style method (`scipy.sparse.linalg.gmres`):
- at 90% similarity, 10,492 unique speakers reduce to 6,085, https://github.com/huseinzol05/malaya-speech/blob/master/data/youtube/mapping-youtube-speakers-90.json
- at 85% similarity, 10,492 unique speakers reduce to 4,312, https://github.com/huseinzol05/malaya-speech/blob/master/data/youtube/mapping-youtube-speakers-85.json
- at 80% similarity, 10,492 unique speakers reduce to 2,912, https://github.com/huseinzol05/malaya-speech/blob/master/data/youtube/mapping-youtube-speakers-80.json
Speaker names are derived as follows,
```python
import os
import json
import pickle

# unique_speakers maps '{filename}-{speaker}' to a global speaker id; it is
# assumed to be loaded from one of the mapping-youtube-speakers-*.json files above.
with open('mapping-youtube-speakers-80.json') as fopen:
    unique_speakers = json.load(fopen)

pkl = 'filename.pkl'
with open(pkl, 'rb') as fopen:
    data = pickle.load(fopen)
filename = os.path.split(pkl)[1].replace('.pkl', '')
for result in data:
    speaker = result['classification_model'][-1]  # e.g. 'speaker 0'
    speaker_name = f'{filename}-{speaker}'
    actual_speaker = unique_speakers[speaker_name]
```
Check example at https://github.com/huseinzol05/malaya-speech/blob/master/data/youtube/calculate-lengths-80.ipynb |
MohamedRashad/multilingual-tts | ---
license: gpl-3.0
dataset_info:
features:
- name: text
dtype: string
- name: speaker
dtype: string
- name: languages
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1561588634.72
num_examples: 25540
download_size: 1548036818
dataset_size: 1561588634.72
task_categories:
- text-to-speech
language:
- ar
- en
- zh
- es
- fr
- hi
- ru
- pt
- ja
- de
- tr
- bn
- id
- ur
- vi
pretty_name: Multilingual TTS
size_categories:
- 10K<n<100K
---
# Before Anything and Everything ⚱
_At the time of writing this Dataset Card, ~**17,490**~ **18,412** civilians have been killed in Palestine (~**7,870**~ **8,000** are children and ~**6,121**~ **6,200** are women)._
**Se**ek **a**ny **n**on-**pro**fit **organi**zation **t**o **he**lp **th**em **wi**th **wh**at **y**ou **c**an (For myself, [I use Mersal](https://www.every.org/mersal/f/support-humanitarian)) 🇵🇸
## Dataset Description
The Multilingual TTS dataset is an exceptional compilation of text-to-speech (TTS) samples, meticulously crafted to showcase the richness and diversity of human languages. This dataset encompasses a variety of real-world sentences in fifteen prominent languages, carefully chosen to reflect global linguistic diversity. Each sample is accompanied by its corresponding high-quality audio output.
<style>
.image-container {
display: flex;
justify-content: center;
align-items: center;
height: 65vh;
margin: 0;
}
.image-container img {
max-width: 48%; /* Adjust the width as needed */
height: auto;
}
</style>
<div class="image-container">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6116d0584ef9fdfbf45dc4d9/UX0s8S2yWSJ3NbbvmOJOi.png">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6116d0584ef9fdfbf45dc4d9/zIyPCWH7Y58gLVCeIfq4n.png">
</div>
## Key Features:
1. **Language Diversity**: The dataset covers a spectrum of languages, including **Beng**ali, **Mand**arin **Chin**ese, **Turk**ish, **Hin**di, **Fre**nch, **Vietn**amese, **Portu**guese, **Span**ish, **Japa**nese, **Ger**man, **Russ**ian, **Indon**esian, **Stan**dard **Ara**bic, **Engl**ish, **a**nd **Ur**du. This wide linguistic representation ensures inclusivity and applicability to a global audience.
2. **Real-World Sentences**: Comprising 25,000 samples, the dataset mirrors authentic communication scenarios. Sentences span diverse topics, ranging from everyday conversations to informative texts and news snippets, providing a comprehensive linguistic landscape.
3. **Multilingual Sentences**: A distinctive feature of this dataset is its inclusion of sentences that seamlessly integrate multiple languages. Each sample combines at least two languages, capturing the intricate dynamics of multilingual communication and rendering the dataset particularly valuable for training and evaluating multilingual TTS systems.
4. **Audio Quality**: Special attention has been given to the audio quality of each sample. The audio outputs are meticulously designed to be clear, natural-sounding, and faithful representations of the corresponding text, ensuring a rich auditory experience.
5. **Generated by GPT-4 and elevenlabs**: The dataset is the result of a collaboration between GPT-4 and elevenlabs, combining cutting-edge language generation capabilities with domain expertise. This collaboration guarantees a high level of accuracy, coherence, and linguistic nuance in both the text and audio components.
## Potential Use Cases:
1. **Multilingual TTS Model Training**: Researchers and developers can leverage this dataset to train and refine multilingual TTS models, enhancing their proficiency across a diverse array of languages.
2. **Cross-Language Evaluation**: The dataset serves as a valuable resource for evaluating TTS systems in handling multilingual scenarios, offering a benchmark for assessing model capabilities across different languages.
3. **Language Integration Testing**: Developers working on applications requiring multilingual TTS functionality can utilize this dataset to test and optimize language integration, ensuring a seamless user experience across various linguistic contexts.
## Acknowledgments:
The creation of the Multilingual TTS dataset was made possible through the collaborative efforts of **OpenAI's GPT-4** and the expertise of **Elevenlabs Multilingual V2**. We extend our gratitude to the AI and language processing communities for their continuous support in advancing the field of multilingual TTS. This dataset stands as a significant contribution, fostering innovation and progress in language technologies.
|
Vtmpas/calc-qa-augment-sft-2-tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1669248
num_examples: 9936
download_size: 45446
dataset_size: 1669248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "calc-qa-augment-sft-2-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AsameerI/desc_title_1k | ---
dataset_info:
features:
- name: title
dtype: string
- name: description
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1771600
num_examples: 1000
download_size: 1202580
dataset_size: 1771600
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adityarra07/train_data_20000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 3370249038.032651
num_examples: 20000
- name: test
num_bytes: 33702564.98032651
num_examples: 200
download_size: 3324093596
dataset_size: 3403951603.0129776
---
# Dataset Card for "train_data_20000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
justquick/pdf12step | ---
license: apache-2.0
---
|